Sep 30 05:28:48 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 05:28:48 crc restorecon[4686]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 05:28:48 crc restorecon[4686]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc 
restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc 
restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 
05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 
crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:48 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 
05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 05:28:49 crc 
restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc 
restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc 
restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 05:28:49 crc restorecon[4686]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 
30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 
crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc 
restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc 
restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc 
restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc 
restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc 
restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 05:28:49 crc restorecon[4686]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 05:28:49 crc restorecon[4686]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 30 05:28:50 crc kubenswrapper[4956]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 05:28:50 crc kubenswrapper[4956]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 30 05:28:50 crc kubenswrapper[4956]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 05:28:50 crc kubenswrapper[4956]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 30 05:28:50 crc kubenswrapper[4956]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 05:28:50 crc kubenswrapper[4956]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.096930 4956 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103220 4956 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103618 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103624 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103628 4956 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103633 4956 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103637 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103642 4956 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103646 4956 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103650 4956 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS 
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103656 4956 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103661 4956 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103668 4956 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103680 4956 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103686 4956 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103692 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103697 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103701 4956 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103705 4956 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103710 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103713 4956 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103717 4956 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103721 4956 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103724 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103728 4956 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103732 4956 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103736 4956 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103740 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103744 4956 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103748 4956 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103752 4956 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103756 4956 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103760 4956 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103763 4956 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103767 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103771 4956 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103775 4956 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103779 4956 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103783 4956 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103788 4956 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103793 4956 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103798 4956 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103802 4956 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103813 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103817 4956 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103821 4956 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103825 4956 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103831 4956 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103836 4956 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103840 4956 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103844 4956 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103848 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103851 4956 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103855 4956 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103858 4956 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103862 4956 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103865 4956 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103868 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103872 4956 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103876 4956 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103879 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103882 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103886 4956 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103889 4956 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103893 4956 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103896 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103899 4956 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103905 4956 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103910 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103914 4956 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103917 4956 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.103921 4956 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104745 4956 flags.go:64] FLAG: --address="0.0.0.0"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104763 4956 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104794 4956 flags.go:64] FLAG: --anonymous-auth="true"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104802 4956 flags.go:64] FLAG: --application-metrics-count-limit="100"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104807 4956 flags.go:64] FLAG: --authentication-token-webhook="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104812 4956 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104818 4956 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104824 4956 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104829 4956 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104833 4956 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104838 4956 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104842 4956 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104846 4956 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104850 4956 flags.go:64] FLAG: --cgroup-root=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104854 4956 flags.go:64] FLAG: --cgroups-per-qos="true"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104858 4956 flags.go:64] FLAG: --client-ca-file=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104862 4956 flags.go:64] FLAG: --cloud-config=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104866 4956 flags.go:64] FLAG: --cloud-provider=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104870 4956 flags.go:64] FLAG: --cluster-dns="[]"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104878 4956 flags.go:64] FLAG: --cluster-domain=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104882 4956 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104887 4956 flags.go:64] FLAG: --config-dir=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104891 4956 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104902 4956 flags.go:64] FLAG: --container-log-max-files="5"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104907 4956 flags.go:64] FLAG: --container-log-max-size="10Mi"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104911 4956 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104915 4956 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104920 4956 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104923 4956 flags.go:64] FLAG: --contention-profiling="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104927 4956 flags.go:64] FLAG: --cpu-cfs-quota="true"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104931 4956 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104936 4956 flags.go:64] FLAG: --cpu-manager-policy="none"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104949 4956 flags.go:64] FLAG: --cpu-manager-policy-options=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104954 4956 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104958 4956 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104962 4956 flags.go:64] FLAG: --enable-debugging-handlers="true"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104966 4956 flags.go:64] FLAG: --enable-load-reader="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104971 4956 flags.go:64] FLAG: --enable-server="true"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104975 4956 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104981 4956 flags.go:64] FLAG: --event-burst="100"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104986 4956 flags.go:64] FLAG: --event-qps="50"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104990 4956 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104994 4956 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.104998 4956 flags.go:64] FLAG: --eviction-hard=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105003 4956 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105007 4956 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105011 4956 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105015 4956 flags.go:64] FLAG: --eviction-soft=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105019 4956 flags.go:64] FLAG: --eviction-soft-grace-period=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105024 4956 flags.go:64] FLAG: --exit-on-lock-contention="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105028 4956 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105032 4956 flags.go:64] FLAG: --experimental-mounter-path=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105036 4956 flags.go:64] FLAG: --fail-cgroupv1="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105041 4956 flags.go:64] FLAG: --fail-swap-on="true"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105045 4956 flags.go:64] FLAG: --feature-gates=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105050 4956 flags.go:64] FLAG: --file-check-frequency="20s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105054 4956 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105058 4956 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105063 4956 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105067 4956 flags.go:64] FLAG: --healthz-port="10248"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105071 4956 flags.go:64] FLAG: --help="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105075 4956 flags.go:64] FLAG: --hostname-override=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105079 4956 flags.go:64] FLAG: --housekeeping-interval="10s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105083 4956 flags.go:64] FLAG: --http-check-frequency="20s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105087 4956 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105091 4956 flags.go:64] FLAG: --image-credential-provider-config=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105095 4956 flags.go:64] FLAG: --image-gc-high-threshold="85"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105099 4956 flags.go:64] FLAG: --image-gc-low-threshold="80"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105104 4956 flags.go:64] FLAG: --image-service-endpoint=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105108 4956 flags.go:64] FLAG: --kernel-memcg-notification="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105135 4956 flags.go:64] FLAG: --kube-api-burst="100"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105140 4956 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105145 4956 flags.go:64] FLAG: --kube-api-qps="50"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105150 4956 flags.go:64] FLAG: --kube-reserved=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105156 4956 flags.go:64] FLAG: --kube-reserved-cgroup=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105160 4956 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105164 4956 flags.go:64] FLAG: --kubelet-cgroups=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105168 4956 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105172 4956 flags.go:64] FLAG: --lock-file=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105177 4956 flags.go:64] FLAG: --log-cadvisor-usage="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105182 4956 flags.go:64] FLAG: --log-flush-frequency="5s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105186 4956 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105192 4956 flags.go:64] FLAG: --log-json-split-stream="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105196 4956 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105200 4956 flags.go:64] FLAG: --log-text-split-stream="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105204 4956 flags.go:64] FLAG: --logging-format="text"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105208 4956 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105213 4956 flags.go:64] FLAG: --make-iptables-util-chains="true"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105217 4956 flags.go:64] FLAG: --manifest-url=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105221 4956 flags.go:64] FLAG: --manifest-url-header=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105227 4956 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105231 4956 flags.go:64] FLAG: --max-open-files="1000000"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105236 4956 flags.go:64] FLAG: --max-pods="110"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105240 4956 flags.go:64] FLAG: --maximum-dead-containers="-1"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105244 4956 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105248 4956 flags.go:64] FLAG: --memory-manager-policy="None"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105253 4956 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105257 4956 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105261 4956 flags.go:64] FLAG: --node-ip="192.168.126.11"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105265 4956 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105275 4956 flags.go:64] FLAG: --node-status-max-images="50"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105282 4956 flags.go:64] FLAG: --node-status-update-frequency="10s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105286 4956 flags.go:64] FLAG: --oom-score-adj="-999"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105290 4956 flags.go:64] FLAG: --pod-cidr=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105297 4956 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105303 4956 flags.go:64] FLAG: --pod-manifest-path=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105307 4956 flags.go:64] FLAG: --pod-max-pids="-1"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105312 4956 flags.go:64] FLAG: --pods-per-core="0"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105316 4956 flags.go:64] FLAG: --port="10250"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105320 4956 flags.go:64] FLAG: --protect-kernel-defaults="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105324 4956 flags.go:64] FLAG: --provider-id=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105328 4956 flags.go:64] FLAG: --qos-reserved=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105332 4956 flags.go:64] FLAG: --read-only-port="10255"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105336 4956 flags.go:64] FLAG: --register-node="true"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105339 4956 flags.go:64] FLAG: --register-schedulable="true"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105344 4956 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105350 4956 flags.go:64] FLAG: --registry-burst="10"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105354 4956 flags.go:64] FLAG: --registry-qps="5"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105358 4956 flags.go:64] FLAG: --reserved-cpus=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105362 4956 flags.go:64] FLAG: --reserved-memory=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105371 4956 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105376 4956 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105381 4956 flags.go:64] FLAG: --rotate-certificates="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105385 4956 flags.go:64] FLAG: --rotate-server-certificates="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105390 4956 flags.go:64] FLAG: --runonce="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105395 4956 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105401 4956 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105405 4956 flags.go:64] FLAG: --seccomp-default="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105409 4956 flags.go:64] FLAG: --serialize-image-pulls="true"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105413 4956 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105418 4956 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105422 4956 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105426 4956 flags.go:64] FLAG: --storage-driver-password="root"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105430 4956 flags.go:64] FLAG: --storage-driver-secure="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105435 4956 flags.go:64] FLAG: --storage-driver-table="stats"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105439 4956 flags.go:64] FLAG: --storage-driver-user="root"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105444 4956 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105448 4956 flags.go:64] FLAG: --sync-frequency="1m0s"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105453 4956 flags.go:64] FLAG: --system-cgroups=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105457 4956 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105464 4956 flags.go:64] FLAG: --system-reserved-cgroup=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105468 4956 flags.go:64] FLAG: --tls-cert-file=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105472 4956 flags.go:64] FLAG: --tls-cipher-suites="[]"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105477 4956 flags.go:64] FLAG: --tls-min-version=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105481 4956 flags.go:64] FLAG: --tls-private-key-file=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105485 4956 flags.go:64] FLAG: --topology-manager-policy="none"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105489 4956 flags.go:64] FLAG: --topology-manager-policy-options=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105493 4956 flags.go:64] FLAG: --topology-manager-scope="container"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105497 4956 flags.go:64] FLAG: --v="2"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105503 4956 flags.go:64] FLAG: --version="false"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105508 4956 flags.go:64] FLAG: --vmodule=""
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105513 4956 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105517 4956 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105624 4956 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105631 4956 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105638 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105643 4956 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105647 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105652 4956 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105656 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105660 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105665 4956 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105670 4956 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105675 4956 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105679 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105683 4956 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105688 4956 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105691 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105696 4956 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105701 4956 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105704 4956 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105708 4956 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105712 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105716 4956 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105719 4956 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105723 4956 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105728 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105732 4956 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105736 4956 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105739 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105743 4956 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105746 4956 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105750 4956 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105753 4956 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105757 4956 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105760 4956 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105764 4956 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105769 4956 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105772 4956 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105776 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105779 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105782 4956 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105786 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105789 4956 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105793 4956 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105796 4956 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105800 4956 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105803 4956 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105808 4956 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105812 4956 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105815 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105818 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105822 4956 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105825 4956 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105829 4956 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105833 4956 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105836 4956 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105839 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105843 4956 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105848 4956 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105851 4956 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105855 4956 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105860 4956 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105864 4956 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105868 4956 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105871 4956 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105875 4956 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105880 4956 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105884 4956 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105890 4956 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105894 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105897 4956 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105901 4956 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.105905 4956 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.105917 4956 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.117028 4956 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.117079 4956 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117275 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117291 4956 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117300 4956 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117309 4956 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117317 4956 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117325 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117333 4956 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117341 4956 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117349 4956 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117357 4956 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117365 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117373 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117382 4956 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117391 4956 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117399 4956 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117408 4956 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117417 4956 feature_gate.go:330] unrecognized feature gate: 
VolumeGroupSnapshot Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117425 4956 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117433 4956 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117441 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117449 4956 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117460 4956 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117475 4956 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117486 4956 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117495 4956 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117503 4956 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117514 4956 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117525 4956 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117533 4956 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117542 4956 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117550 4956 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117559 4956 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117568 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117576 4956 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117586 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117594 4956 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117602 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117610 4956 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117618 4956 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117625 4956 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117635 4956 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117645 4956 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117652 4956 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117660 4956 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117668 4956 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117676 4956 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117684 4956 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117692 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117699 4956 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117707 4956 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117716 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117724 4956 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117732 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117740 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117748 4956 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 
05:28:50.117755 4956 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117763 4956 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117771 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117778 4956 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117786 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117794 4956 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117803 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117811 4956 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117819 4956 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117826 4956 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117834 4956 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117841 4956 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117850 4956 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117857 4956 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117865 4956 feature_gate.go:330] unrecognized feature gate: 
InsightsOnDemandDataGather Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.117873 4956 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.117888 4956 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118236 4956 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118251 4956 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118260 4956 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118269 4956 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118277 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118285 4956 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118293 4956 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118300 4956 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118308 4956 feature_gate.go:330] unrecognized feature gate: Example Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118316 4956 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118327 4956 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118339 4956 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118348 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118357 4956 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118365 4956 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118373 4956 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118382 4956 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118391 4956 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118400 4956 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118409 4956 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118417 4956 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118425 4956 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118433 4956 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118440 4956 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118449 4956 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118457 4956 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118464 4956 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118472 4956 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118480 4956 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118487 4956 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118495 4956 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118503 4956 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118511 4956 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118519 4956 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118528 4956 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118536 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118546 4956 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118556 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118564 4956 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118572 4956 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118582 4956 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118590 4956 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118599 4956 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118607 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118615 4956 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118623 4956 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118631 4956 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118640 4956 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118648 4956 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118656 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118664 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118672 4956 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118680 4956 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118688 4956 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118697 4956 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118707 4956 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118716 4956 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118725 4956 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118732 4956 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118740 4956 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118747 4956 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118755 4956 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118762 4956 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118771 4956 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118779 4956 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118786 4956 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118794 4956 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118801 4956 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118809 4956 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118817 4956 
feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.118825 4956 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.118839 4956 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.119309 4956 server.go:940] "Client rotation is on, will bootstrap in background" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.129330 4956 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.129488 4956 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.132105 4956 server.go:997] "Starting client certificate rotation" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.132181 4956 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.132849 4956 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-16 03:57:56.107931838 +0000 UTC Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.132971 4956 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2590h29m5.974965608s for next certificate rotation Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.159874 4956 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.163171 4956 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.179524 4956 log.go:25] "Validated CRI v1 runtime API" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.220938 4956 log.go:25] "Validated CRI v1 image API" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.222641 4956 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.229013 4956 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-05-23-26-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.229060 4956 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.245761 4956 manager.go:217] Machine: {Timestamp:2025-09-30 05:28:50.243980348 +0000 UTC m=+0.571100893 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799886 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:623396aa-9d66-4fad-a73b-3a90f4645680 BootID:a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:bf:31:da Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:bf:31:da Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e9:a5:bf Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:5c:d1:69 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:f4:4e:83 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a4:fe:85 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:42:49:7f:c3:0b:ff Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d2:2f:86:2e:5a:b8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.246080 4956 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.246263 4956 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.248064 4956 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.248390 4956 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.248453 4956 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.248757 4956 topology_manager.go:138] "Creating topology manager with none policy" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.248770 4956 container_manager_linux.go:303] "Creating device plugin manager" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.249348 4956 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.249386 4956 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.249643 4956 state_mem.go:36] "Initialized new in-memory state store" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.249762 4956 server.go:1245] "Using root directory" path="/var/lib/kubelet" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.254900 4956 kubelet.go:418] "Attempting to sync node with API server" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.254930 4956 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.254982 4956 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.255015 4956 kubelet.go:324] "Adding apiserver pod source" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.255067 4956 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 
05:28:50.259500 4956 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.260611 4956 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.263599 4956 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.264965 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.265038 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.265098 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.265124 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.265169 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.265182 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.265190 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.265205 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.265218 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.265231 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.265301 4956 
plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.265312 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.266803 4956 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.266725 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.266922 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.267259 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.267313 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.268908 4956 server.go:1280] "Started kubelet" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 
05:28:50.269020 4956 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.269272 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.270203 4956 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.271075 4956 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 30 05:28:50 crc systemd[1]: Started Kubernetes Kubelet. Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.272405 4956 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.272454 4956 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.272889 4956 server.go:460] "Adding debug handlers to kubelet server" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.273410 4956 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 05:43:31.984391986 +0000 UTC Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.273461 4956 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1896h14m41.710934947s for next certificate rotation Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.274841 4956 volume_manager.go:287] "The desired_state_of_world populator starts" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.274980 4956 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.275031 4956 
desired_state_of_world_populator.go:146] "Desired state populator starts to run" Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.274837 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.275741 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.275821 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.276048 4956 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.82:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1869f84af3f196b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 05:28:50.268853944 +0000 UTC m=+0.595974509,LastTimestamp:2025-09-30 05:28:50.268853944 +0000 UTC m=+0.595974509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.283873 4956 factory.go:219] Registration of the containerd container factory failed: unable to create containerd 
client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.283913 4956 factory.go:55] Registering systemd factory Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.283930 4956 factory.go:221] Registration of the systemd container factory successfully Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.285055 4956 factory.go:153] Registering CRI-O factory Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.285107 4956 factory.go:221] Registration of the crio container factory successfully Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.285158 4956 factory.go:103] Registering Raw factory Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.285186 4956 manager.go:1196] Started watching for new ooms in manager Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.286076 4956 manager.go:319] Starting recovery of all containers Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.287341 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="200ms" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.290823 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.290912 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 30 05:28:50 crc 
kubenswrapper[4956]: I0930 05:28:50.290940 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.290963 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.290983 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291003 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291023 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291075 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291098 4956 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291147 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291173 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291195 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291213 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291238 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291256 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291277 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291296 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291314 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291336 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291355 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291373 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.291392 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.298451 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.298500 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.298563 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.298596 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.298655 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" 
seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.298694 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.298737 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.298779 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.298823 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.298857 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.298902 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 30 05:28:50 crc 
kubenswrapper[4956]: I0930 05:28:50.302653 4956 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.302801 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303355 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303395 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303416 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303446 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" 
seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303463 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303487 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303525 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303552 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303583 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303602 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 
05:28:50.303626 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303642 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303674 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303696 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303715 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303735 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303750 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303767 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303799 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303822 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303855 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303881 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303900 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303921 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303942 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303963 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.303979 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.304015 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.304031 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.304085 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.304105 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.304152 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.304175 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.304215 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305039 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 
05:28:50.305167 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305202 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305225 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305245 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305269 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305289 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305311 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305333 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305353 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305372 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305393 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305413 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305431 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305452 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305473 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305492 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305513 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305534 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305554 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305575 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305595 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305616 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305636 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305657 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305678 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305700 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305722 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305742 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305763 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305785 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305805 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305828 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305847 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305866 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305886 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305919 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305945 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" 
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305970 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.305996 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306023 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306046 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306071 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306091 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306140 4956 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306164 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306194 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306214 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306237 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306257 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306281 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306302 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306325 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306346 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306368 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306392 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306417 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306436 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306458 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306478 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306500 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306521 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306546 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306567 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306588 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306609 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306631 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306651 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306671 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306693 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306713 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306733 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306756 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306777 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306798 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" 
seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306818 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306839 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306861 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306882 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306902 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306923 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 
05:28:50.306943 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306963 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.306981 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307001 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307021 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307041 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307061 4956 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307083 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307105 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307156 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307176 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307195 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307217 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307237 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307257 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307277 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307300 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307322 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307342 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Sep 
30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307363 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307382 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307405 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307426 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307449 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307471 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307491 4956 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307706 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307729 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307750 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307771 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307793 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307813 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307835 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307857 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307878 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307900 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307922 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307943 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307963 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.307984 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308003 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308026 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308045 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308067 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308088 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308108 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308153 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308176 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308196 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308216 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" 
seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308237 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308260 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308282 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308305 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308327 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308348 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 
05:28:50.308371 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308391 4956 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308414 4956 reconstruct.go:97] "Volume reconstruction finished" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.308428 4956 reconciler.go:26] "Reconciler: start to sync state" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.321737 4956 manager.go:324] Recovery completed Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.337741 4956 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.339291 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.339651 4956 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.339707 4956 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.339740 4956 kubelet.go:2335] "Starting kubelet main sync loop" Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.339794 4956 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.340858 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.340956 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:50 crc kubenswrapper[4956]: W0930 05:28:50.340903 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.341060 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.341018 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.344044 4956 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.344079 4956 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" 
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.344155 4956 state_mem.go:36] "Initialized new in-memory state store" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.367687 4956 policy_none.go:49] "None policy: Start" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.371418 4956 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.371451 4956 state_mem.go:35] "Initializing new in-memory state store" Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.375600 4956 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.440504 4956 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.443144 4956 manager.go:334] "Starting Device Plugin manager" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.443201 4956 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.443218 4956 server.go:79] "Starting device plugin registration server" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.443723 4956 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.443747 4956 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.445681 4956 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.445823 4956 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.445833 4956 plugin_manager.go:118] "Starting Kubelet 
Plugin Manager" Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.456709 4956 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.488187 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="400ms" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.544892 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.546172 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.546220 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.546235 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.546267 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.546940 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.641159 4956 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] 
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.641286 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.643058 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.643092 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.643101 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.643220 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.643529 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.643600 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.644101 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.644151 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.644167 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.644689 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.644948 4956 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.644984 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.645270 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.645295 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.645302 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.646053 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.646078 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.646095 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.646257 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.646490 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.646576 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.649764 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.649789 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.649799 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.649905 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.650387 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.650423 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.650456 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.650464 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.650489 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.650388 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.650666 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.650682 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.650957 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.651021 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.651057 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.651360 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.651414 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.651801 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.651851 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.651874 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.652553 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.652582 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.652594 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715331 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715390 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715427 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715459 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715489 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715520 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715551 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 
05:28:50.715580 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715608 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715636 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715663 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715691 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715734 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715774 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.715806 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.747096 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.748245 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.748287 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.748301 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.748350 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.748921 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817505 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817597 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817624 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817665 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817685 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817703 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817727 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817751 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817770 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817764 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817802 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817842 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817868 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817793 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817874 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817915 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817896 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817762 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817883 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817947 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.817976 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.818046 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.818061 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.818105 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.818162 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.818179 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.818242 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.818245 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.818286 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.818343 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 05:28:50 crc kubenswrapper[4956]: E0930 05:28:50.889493 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="800ms"
Sep 30 05:28:50 crc kubenswrapper[4956]: I0930 05:28:50.998825 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.010561 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.028320 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.045881 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 05:28:51 crc kubenswrapper[4956]: W0930 05:28:51.050992 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-1133f1c7c8d9821d05048d846be916d299e771fd1fdfe0435becf415c5338f87 WatchSource:0}: Error finding container 1133f1c7c8d9821d05048d846be916d299e771fd1fdfe0435becf415c5338f87: Status 404 returned error can't find the container with id 1133f1c7c8d9821d05048d846be916d299e771fd1fdfe0435becf415c5338f87
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.052552 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 05:28:51 crc kubenswrapper[4956]: W0930 05:28:51.058826 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-15cc9dc15a3c9838aee88e6d58c45f38e228c5ef3e035009be5d59f35921359f WatchSource:0}: Error finding container 15cc9dc15a3c9838aee88e6d58c45f38e228c5ef3e035009be5d59f35921359f: Status 404 returned error can't find the container with id 15cc9dc15a3c9838aee88e6d58c45f38e228c5ef3e035009be5d59f35921359f
Sep 30 05:28:51 crc kubenswrapper[4956]: W0930 05:28:51.059248 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d6f6b317a2c5ea39415958f8feab1104970ed060492ac3ca87924436e3a2e8fe WatchSource:0}: Error finding container d6f6b317a2c5ea39415958f8feab1104970ed060492ac3ca87924436e3a2e8fe: Status 404 returned error can't find the container with id d6f6b317a2c5ea39415958f8feab1104970ed060492ac3ca87924436e3a2e8fe
Sep 30 05:28:51 crc kubenswrapper[4956]: W0930 05:28:51.069360 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-97472cc41b65fe848145c65518446d2c9d5e06967518317b9892b01d9a494906 WatchSource:0}: Error finding container 97472cc41b65fe848145c65518446d2c9d5e06967518317b9892b01d9a494906: Status 404 returned error can't find the container with id 97472cc41b65fe848145c65518446d2c9d5e06967518317b9892b01d9a494906
Sep 30 05:28:51 crc kubenswrapper[4956]: W0930 05:28:51.090851 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-731494f705f8dab6031d1503d8c25d52dca0c2a13cb55c17012fd9a78acb0eb1 WatchSource:0}: Error finding container 731494f705f8dab6031d1503d8c25d52dca0c2a13cb55c17012fd9a78acb0eb1: Status 404 returned error can't find the container with id 731494f705f8dab6031d1503d8c25d52dca0c2a13cb55c17012fd9a78acb0eb1
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.149482 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.151920 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.152035 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.152056 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.152275 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 05:28:51 crc kubenswrapper[4956]: E0930 05:28:51.153423 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.270458 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Sep 30 05:28:51 crc kubenswrapper[4956]: W0930 05:28:51.275473 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Sep 30 05:28:51 crc kubenswrapper[4956]: E0930 05:28:51.275604 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Sep 30 05:28:51 crc kubenswrapper[4956]: W0930 05:28:51.337829 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Sep 30 05:28:51 crc kubenswrapper[4956]: E0930 05:28:51.337966 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.344203 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"15cc9dc15a3c9838aee88e6d58c45f38e228c5ef3e035009be5d59f35921359f"}
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.346212 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d6f6b317a2c5ea39415958f8feab1104970ed060492ac3ca87924436e3a2e8fe"}
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.347777 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1133f1c7c8d9821d05048d846be916d299e771fd1fdfe0435becf415c5338f87"}
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.349222 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"731494f705f8dab6031d1503d8c25d52dca0c2a13cb55c17012fd9a78acb0eb1"}
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.350544 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"97472cc41b65fe848145c65518446d2c9d5e06967518317b9892b01d9a494906"}
Sep 30 05:28:51 crc kubenswrapper[4956]: E0930 05:28:51.691858 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="1.6s"
Sep 30 05:28:51 crc kubenswrapper[4956]: W0930 05:28:51.796380 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Sep 30 05:28:51 crc kubenswrapper[4956]: E0930 05:28:51.796915 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Sep 30 05:28:51 crc kubenswrapper[4956]: W0930 05:28:51.803769 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Sep 30 05:28:51 crc kubenswrapper[4956]: E0930 05:28:51.803828 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.954071 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.955720 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.955748 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.955756 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:51 crc kubenswrapper[4956]: I0930 05:28:51.955776 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 05:28:51 crc kubenswrapper[4956]: E0930 05:28:51.956198 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.269995 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.354912 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118" exitCode=0
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.354987 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118"}
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.355017 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.355748 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.355772 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.355781 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.357540 4956 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c" exitCode=0
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.357671 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.357720 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c"}
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.358132 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.358495 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.358520 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.358528 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.359099 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.359128 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.359136 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.360946 4956 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e464b77d04edaec7bd1158adcf6a0b18f0baa4417c81703570f156e7165877c0" exitCode=0
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.361011 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e464b77d04edaec7bd1158adcf6a0b18f0baa4417c81703570f156e7165877c0"}
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.361036 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.366409 4956 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a" exitCode=0
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.366438 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a"}
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.366497 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.366596 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.367251 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.367268 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.367615 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.367653 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.367673 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.373004 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575"}
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.373043 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9"}
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.373059 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018"}
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.373071 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3"}
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.373177 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.373921 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.373954 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:52 crc kubenswrapper[4956]: I0930 05:28:52.373965 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:52 crc kubenswrapper[4956]: E0930 05:28:52.546604 4956 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.82:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1869f84af3f196b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 05:28:50.268853944 +0000 UTC m=+0.595974509,LastTimestamp:2025-09-30 05:28:50.268853944 +0000 UTC m=+0.595974509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.081327 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.269871 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Sep 30 05:28:53 crc kubenswrapper[4956]: E0930 05:28:53.292951 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.82:6443: connect: connection refused" interval="3.2s"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.381862 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf"}
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.381914 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5cfec1fc2efa8398dde0bbf6da8eecd9434e2f71a302e8a31c97ba4f40518159"}
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.381926 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6"}
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.381927 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.383080 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.383107 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.383128 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.386618 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460"}
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.386653 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d"}
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.386666 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117"}
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.386679 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1"}
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.388605 4956 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c" exitCode=0
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.388655 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c"}
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.388813 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.391929 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.391976 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.392000 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.394306 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0479f4bb7141ac2e5f5eda2994f1bca6f2b3ded35fce60581a8428858575bf4d"}
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.394459 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.395727 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.395758 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.395768 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.395992 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.399581 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.399609 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.399618 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.556885 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.557886 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.557979 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.558035 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:28:53 crc kubenswrapper[4956]: I0930 05:28:53.558107 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 05:28:53 crc kubenswrapper[4956]: E0930 05:28:53.558520 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.82:6443: connect: connection refused" node="crc"
Sep 30 05:28:53 crc kubenswrapper[4956]: W0930 05:28:53.843942 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.82:6443: connect: connection refused
Sep 30 05:28:53 crc kubenswrapper[4956]: E0930 05:28:53.844037 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.82:6443: connect: connection refused" logger="UnhandledError"
Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.402441 4956 kubelet.go:2453] "SyncLoop (PLEG): event for
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b"} Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.402507 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.403775 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.403817 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.403833 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.404803 4956 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215" exitCode=0 Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.404892 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215"} Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.404942 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.404987 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.405732 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.405810 4956 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.405818 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.405961 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.405988 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.406004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.406567 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.406603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.406620 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.407689 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.407732 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.407748 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.407825 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.407851 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.407868 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:54 crc kubenswrapper[4956]: I0930 05:28:54.827142 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.412049 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2"} Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.412097 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.412130 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba"} Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.412152 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0"} Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.412166 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.412171 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.412167 4956 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4"} Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.413496 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.413527 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.413541 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.413502 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.413741 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:55 crc kubenswrapper[4956]: I0930 05:28:55.413753 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.152003 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.152180 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.153281 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.153324 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:56 crc 
kubenswrapper[4956]: I0930 05:28:56.153338 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.156269 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.418528 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1"} Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.418628 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.418658 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.418678 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.419271 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.419465 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.419495 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.419507 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.419727 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:56 crc 
kubenswrapper[4956]: I0930 05:28:56.419779 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.419800 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.420272 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.420304 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.420313 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.759324 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.761008 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.761073 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.761097 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.761179 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 05:28:56 crc kubenswrapper[4956]: I0930 05:28:56.890676 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 30 05:28:57 crc kubenswrapper[4956]: I0930 05:28:57.423769 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:57 
crc kubenswrapper[4956]: I0930 05:28:57.425337 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:57 crc kubenswrapper[4956]: I0930 05:28:57.425406 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:57 crc kubenswrapper[4956]: I0930 05:28:57.425426 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:58 crc kubenswrapper[4956]: I0930 05:28:58.426863 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:58 crc kubenswrapper[4956]: I0930 05:28:58.428073 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:58 crc kubenswrapper[4956]: I0930 05:28:58.428180 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:58 crc kubenswrapper[4956]: I0930 05:28:58.428205 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:58 crc kubenswrapper[4956]: I0930 05:28:58.947887 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 05:28:58 crc kubenswrapper[4956]: I0930 05:28:58.948061 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:58 crc kubenswrapper[4956]: I0930 05:28:58.949228 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:58 crc kubenswrapper[4956]: I0930 05:28:58.949252 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:58 crc kubenswrapper[4956]: I0930 05:28:58.949260 4956 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Sep 30 05:28:59 crc kubenswrapper[4956]: I0930 05:28:59.024097 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 05:28:59 crc kubenswrapper[4956]: I0930 05:28:59.429511 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:28:59 crc kubenswrapper[4956]: I0930 05:28:59.430571 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:28:59 crc kubenswrapper[4956]: I0930 05:28:59.430606 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:28:59 crc kubenswrapper[4956]: I0930 05:28:59.430614 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:00 crc kubenswrapper[4956]: E0930 05:29:00.456822 4956 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 05:29:00 crc kubenswrapper[4956]: I0930 05:29:00.755084 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 05:29:00 crc kubenswrapper[4956]: I0930 05:29:00.755279 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:29:00 crc kubenswrapper[4956]: I0930 05:29:00.756229 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:00 crc kubenswrapper[4956]: I0930 05:29:00.756255 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:00 crc kubenswrapper[4956]: I0930 05:29:00.756263 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 
05:29:01 crc kubenswrapper[4956]: I0930 05:29:01.209486 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 30 05:29:01 crc kubenswrapper[4956]: I0930 05:29:01.210191 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:29:01 crc kubenswrapper[4956]: I0930 05:29:01.211865 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:01 crc kubenswrapper[4956]: I0930 05:29:01.211902 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:01 crc kubenswrapper[4956]: I0930 05:29:01.211912 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:01 crc kubenswrapper[4956]: I0930 05:29:01.825850 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 05:29:01 crc kubenswrapper[4956]: I0930 05:29:01.826076 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:29:01 crc kubenswrapper[4956]: I0930 05:29:01.827570 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:01 crc kubenswrapper[4956]: I0930 05:29:01.827601 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:01 crc kubenswrapper[4956]: I0930 05:29:01.827612 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:01 crc kubenswrapper[4956]: I0930 05:29:01.830198 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 05:29:02 crc kubenswrapper[4956]: I0930 05:29:02.437755 
4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:29:02 crc kubenswrapper[4956]: I0930 05:29:02.438871 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:02 crc kubenswrapper[4956]: I0930 05:29:02.438912 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:02 crc kubenswrapper[4956]: I0930 05:29:02.438922 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:03 crc kubenswrapper[4956]: W0930 05:29:03.974139 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 30 05:29:03 crc kubenswrapper[4956]: I0930 05:29:03.974579 4956 trace.go:236] Trace[1251952917]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 05:28:53.972) (total time: 10001ms): Sep 30 05:29:03 crc kubenswrapper[4956]: Trace[1251952917]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:29:03.974) Sep 30 05:29:03 crc kubenswrapper[4956]: Trace[1251952917]: [10.001948785s] [10.001948785s] END Sep 30 05:29:03 crc kubenswrapper[4956]: E0930 05:29:03.974615 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 30 05:29:04 crc kubenswrapper[4956]: W0930 05:29:04.183397 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.183519 4956 trace.go:236] Trace[1501907512]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 05:28:54.182) (total time: 10000ms): Sep 30 05:29:04 crc kubenswrapper[4956]: Trace[1501907512]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (05:29:04.183) Sep 30 05:29:04 crc kubenswrapper[4956]: Trace[1501907512]: [10.000983282s] [10.000983282s] END Sep 30 05:29:04 crc kubenswrapper[4956]: E0930 05:29:04.183553 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.270557 4956 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Sep 30 05:29:04 crc kubenswrapper[4956]: W0930 05:29:04.326249 4956 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.326348 4956 trace.go:236] Trace[997670433]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 05:28:54.324) (total time: 10001ms): Sep 30 05:29:04 crc kubenswrapper[4956]: 
Trace[997670433]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:29:04.326) Sep 30 05:29:04 crc kubenswrapper[4956]: Trace[997670433]: [10.001691546s] [10.001691546s] END Sep 30 05:29:04 crc kubenswrapper[4956]: E0930 05:29:04.326384 4956 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.407484 4956 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.407555 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.412818 4956 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.412897 4956 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.443140 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.445038 4956 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b" exitCode=255 Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.445091 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b"} Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.445262 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.446416 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.446438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.446447 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.447063 4956 scope.go:117] "RemoveContainer" containerID="83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b" Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.826771 4956 patch_prober.go:28] 
interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 05:29:04 crc kubenswrapper[4956]: I0930 05:29:04.826855 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 05:29:05 crc kubenswrapper[4956]: I0930 05:29:05.449531 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 05:29:05 crc kubenswrapper[4956]: I0930 05:29:05.451329 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3"} Sep 30 05:29:05 crc kubenswrapper[4956]: I0930 05:29:05.451508 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:29:05 crc kubenswrapper[4956]: I0930 05:29:05.452580 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:05 crc kubenswrapper[4956]: I0930 05:29:05.452618 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:05 crc kubenswrapper[4956]: I0930 05:29:05.452631 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 30 05:29:07 crc kubenswrapper[4956]: I0930 05:29:07.833865 4956 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.265865 4956 apiserver.go:52] "Watching apiserver" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.271236 4956 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.271475 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.271821 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.271859 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:08 crc kubenswrapper[4956]: E0930 05:29:08.271897 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.272062 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.272142 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:08 crc kubenswrapper[4956]: E0930 05:29:08.272229 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.272276 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.272602 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:08 crc kubenswrapper[4956]: E0930 05:29:08.272850 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.274638 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.274920 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.274940 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.274940 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.274973 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.274924 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.275065 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.275079 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.275132 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.275783 4956 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 30 05:29:08 crc kubenswrapper[4956]: 
I0930 05:29:08.321167 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.333137 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.349857 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.360832 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.369054 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.380725 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.390013 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.398205 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.514027 4956 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.948144 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 05:29:08 crc kubenswrapper[4956]: I0930 05:29:08.960756 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.028515 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.041181 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.049084 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.057641 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.065446 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.078657 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.088173 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.099390 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.311757 4956 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.391012 4956 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.396443 4956 trace.go:236] Trace[1479186940]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 
05:28:58.845) (total time: 10550ms): Sep 30 05:29:09 crc kubenswrapper[4956]: Trace[1479186940]: ---"Objects listed" error: 10550ms (05:29:09.396) Sep 30 05:29:09 crc kubenswrapper[4956]: Trace[1479186940]: [10.550616056s] [10.550616056s] END Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.396716 4956 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.397273 4956 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.397655 4956 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.463560 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.473195 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.481615 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.492596 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.497832 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.498081 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.498207 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.498312 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.498196 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.498377 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.498394 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.498543 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.498623 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.498657 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.498842 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.498949 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499041 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499145 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499222 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499258 4956 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499308 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499375 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499394 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499393 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499432 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499449 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499465 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499468 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499480 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499522 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499523 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499531 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499547 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499572 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499585 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499594 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499618 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499631 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499640 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499642 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499664 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499674 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499710 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499727 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499730 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499738 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499750 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499747 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499789 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499804 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499836 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499858 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499886 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499909 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499931 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499953 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499977 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500004 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500030 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499809 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: 
"c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500054 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499857 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500064 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499914 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.499928 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500045 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500077 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500136 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500109 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: 
"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500157 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500175 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500195 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500211 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500222 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500229 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500247 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500252 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500266 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500300 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500316 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500335 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500352 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500367 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500383 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500405 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500420 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500435 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500451 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500467 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500481 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500498 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500543 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500562 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500578 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500614 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500630 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500646 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500662 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500678 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500694 4956 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500709 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500725 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500761 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500779 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500797 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500813 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500830 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500844 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500889 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500905 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 05:29:09 crc kubenswrapper[4956]: 
I0930 05:29:09.500920 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500938 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500957 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501025 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501048 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501069 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501089 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501103 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501132 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501148 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501168 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501183 4956 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501201 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501217 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501232 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501248 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501264 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501278 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501293 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501308 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501326 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501341 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501356 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501371 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501385 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501400 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501422 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501436 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501453 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501468 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501483 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501497 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501513 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc 
kubenswrapper[4956]: I0930 05:29:09.501529 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501546 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501562 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501576 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501592 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501623 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") 
pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501637 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501653 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501666 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501681 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501696 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501711 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501726 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501744 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501760 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501775 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501792 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501807 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501822 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501838 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501854 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501868 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501884 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501899 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501913 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501931 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501948 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501966 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 05:29:09 
crc kubenswrapper[4956]: I0930 05:29:09.501981 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501997 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502013 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502030 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502045 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502062 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502079 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502097 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502125 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502142 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502157 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 
05:29:09.502174 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502189 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502206 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502222 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502238 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502253 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502270 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502286 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502301 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502317 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502336 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502352 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502375 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502392 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502407 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502423 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502440 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 05:29:09 crc kubenswrapper[4956]: 
I0930 05:29:09.502457 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502474 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502489 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502506 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502522 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502540 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502557 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502573 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502588 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502604 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502621 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 05:29:09 crc 
kubenswrapper[4956]: I0930 05:29:09.502638 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502654 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502669 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502686 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502702 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502747 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502764 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502783 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502800 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502817 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502832 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502847 4956 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502868 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502885 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502902 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502917 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502961 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502983 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503004 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503026 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503047 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503066 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503083 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503102 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503467 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503488 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503505 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503526 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503543 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503560 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503615 4956 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503626 4956 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503637 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503647 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503656 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503666 4956 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503677 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503688 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503697 4956 
reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503707 4956 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503717 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503727 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503736 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503745 4956 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503755 4956 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503764 4956 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503773 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503782 4956 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503792 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503800 4956 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503810 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503820 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503830 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc 
kubenswrapper[4956]: I0930 05:29:09.503840 4956 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503849 4956 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503858 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503867 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.504488 4956 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.508272 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.519027 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.519221 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500296 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500317 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500374 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500393 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500437 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500477 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500519 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500546 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500594 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500608 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500652 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500744 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500756 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500806 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500882 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500962 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.500972 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501029 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501294 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.501836 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502043 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502149 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502245 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502311 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502325 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502425 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502483 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502982 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.502987 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503080 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503211 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503416 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503565 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503578 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503805 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503819 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.503935 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.504063 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.504229 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.504342 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.505219 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.506340 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.506737 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.506797 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.506906 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.506918 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.521504 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.506925 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.506142 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.507232 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.507453 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.507465 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.507509 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.507533 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.507676 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.507843 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.508248 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.508268 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.508287 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.508379 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.508584 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.508623 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.508670 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.508794 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.508857 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.508544 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.509163 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.509187 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.509385 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.509649 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.510061 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.510212 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.510223 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.510299 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.510443 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.510519 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.510529 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.510537 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.510955 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.511038 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.511084 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.511155 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.511521 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.511564 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.511758 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.511792 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.511923 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.512183 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.512326 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.512331 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.512588 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.512693 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.512794 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.512968 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.512980 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.512990 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.513001 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.513063 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.513283 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.513546 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.513562 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.513677 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.516213 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.516233 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.517660 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:29:10.017634267 +0000 UTC m=+20.344754792 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.517796 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.517969 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.518502 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.518547 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.518647 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.519079 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.519188 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.519767 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.520158 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.520256 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.520314 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.520604 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.520622 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.520659 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.520898 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.525656 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.526219 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.526320 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.526381 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:10.026364671 +0000 UTC m=+20.353485316 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.526521 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.526578 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:10.026563948 +0000 UTC m=+20.353684473 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.532064 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.532489 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.532524 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.532542 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.532607 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:10.032588211 +0000 UTC m=+20.359708736 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.534765 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.536075 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.540706 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.541363 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.541489 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.545123 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.545991 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.546017 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.546038 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:09 crc kubenswrapper[4956]: E0930 05:29:09.546100 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:10.046078967 +0000 UTC m=+20.373199492 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.547074 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.547188 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.547397 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.547766 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.550542 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.551822 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.552172 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.555535 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.559107 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-htk97"] Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.559444 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-htk97" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.559587 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.560634 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.561539 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.561738 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.561743 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.565207 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.565455 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.566035 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.566724 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.566656 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.567761 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.567823 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.569556 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.570092 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.570206 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.570060 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.571437 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.571503 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.571569 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.571623 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.572282 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.572483 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.572502 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.572868 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.573489 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.573591 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.573661 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.574989 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.576517 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.576531 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.576832 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.578460 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.578478 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.579474 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.580388 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.581144 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.581543 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.581728 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.582916 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.582937 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.582746 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.582725 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.583784 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.590593 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.590820 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.595546 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.598062 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604533 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/da40fd61-e4f1-4780-bf28-5dd931e1a265-hosts-file\") pod \"node-resolver-htk97\" (UID: \"da40fd61-e4f1-4780-bf28-5dd931e1a265\") " pod="openshift-dns/node-resolver-htk97" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604562 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjqdh\" (UniqueName: \"kubernetes.io/projected/da40fd61-e4f1-4780-bf28-5dd931e1a265-kube-api-access-hjqdh\") pod 
\"node-resolver-htk97\" (UID: \"da40fd61-e4f1-4780-bf28-5dd931e1a265\") " pod="openshift-dns/node-resolver-htk97" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604601 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604621 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604684 4956 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604696 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604811 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604805 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604842 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604923 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604951 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604965 4956 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604978 4956 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.604990 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605003 4956 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605015 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605029 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605041 4956 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605053 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605064 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605077 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605089 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605166 4956 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605180 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605191 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605204 4956 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605218 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605230 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605242 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath 
\"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605255 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605267 4956 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605280 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605293 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605305 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605316 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605330 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605342 4956 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605354 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605366 4956 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605379 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605391 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605403 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605478 4956 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605495 4956 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605627 4956 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605644 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605654 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605664 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605675 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605685 4956 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605697 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc 
kubenswrapper[4956]: I0930 05:29:09.605708 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605719 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605729 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605741 4956 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605751 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605760 4956 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605771 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605781 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605792 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605803 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605813 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605822 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605833 4956 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605846 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605857 4956 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" 
DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605866 4956 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605876 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605885 4956 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605896 4956 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605908 4956 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605919 4956 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605932 4956 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605944 4956 reconciler_common.go:293] "Volume detached for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605955 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605968 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605979 4956 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.605990 4956 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606002 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606013 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606023 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606035 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606045 4956 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606056 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606066 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606076 4956 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606086 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606109 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" 
DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606132 4956 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606141 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606149 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606157 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606166 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606175 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606184 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 30 
05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606193 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606200 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606210 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606263 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606342 4956 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606353 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606362 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606370 
4956 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606379 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606388 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606396 4956 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606407 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606419 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606430 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606441 4956 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606452 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606462 4956 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606480 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606490 4956 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606500 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606510 4956 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606520 4956 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") 
on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606530 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606542 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606550 4956 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606558 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606567 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606575 4956 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606583 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606591 4956 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606599 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606612 4956 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606621 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606629 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606637 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606647 4956 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606690 4956 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606699 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606707 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606715 4956 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606723 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606778 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606789 4956 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606798 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606807 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606815 4956 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606824 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606835 4956 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606844 4956 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606851 4956 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606859 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 
05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606867 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606875 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606883 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606891 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606900 4956 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606907 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606918 4956 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on 
node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606928 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606936 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606945 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606953 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606961 4956 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606970 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606978 4956 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc 
kubenswrapper[4956]: I0930 05:29:09.606986 4956 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.606994 4956 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.607002 4956 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.607012 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.607032 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.607041 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.607048 4956 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.607057 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.607066 4956 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.607074 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.607082 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.607090 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.607098 4956 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.607163 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.617588 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.626983 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.635508 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.707657 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/da40fd61-e4f1-4780-bf28-5dd931e1a265-hosts-file\") pod \"node-resolver-htk97\" (UID: \"da40fd61-e4f1-4780-bf28-5dd931e1a265\") " pod="openshift-dns/node-resolver-htk97" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.707711 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjqdh\" (UniqueName: 
\"kubernetes.io/projected/da40fd61-e4f1-4780-bf28-5dd931e1a265-kube-api-access-hjqdh\") pod \"node-resolver-htk97\" (UID: \"da40fd61-e4f1-4780-bf28-5dd931e1a265\") " pod="openshift-dns/node-resolver-htk97" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.708022 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/da40fd61-e4f1-4780-bf28-5dd931e1a265-hosts-file\") pod \"node-resolver-htk97\" (UID: \"da40fd61-e4f1-4780-bf28-5dd931e1a265\") " pod="openshift-dns/node-resolver-htk97" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.722512 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjqdh\" (UniqueName: \"kubernetes.io/projected/da40fd61-e4f1-4780-bf28-5dd931e1a265-kube-api-access-hjqdh\") pod \"node-resolver-htk97\" (UID: \"da40fd61-e4f1-4780-bf28-5dd931e1a265\") " pod="openshift-dns/node-resolver-htk97" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.796592 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.804563 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 05:29:09 crc kubenswrapper[4956]: W0930 05:29:09.808521 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-966ce03fa4f7ec6b85211bfeee22d542a6c359c037e5408f84eaba43ff2c4ba5 WatchSource:0}: Error finding container 966ce03fa4f7ec6b85211bfeee22d542a6c359c037e5408f84eaba43ff2c4ba5: Status 404 returned error can't find the container with id 966ce03fa4f7ec6b85211bfeee22d542a6c359c037e5408f84eaba43ff2c4ba5 Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.810003 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 05:29:09 crc kubenswrapper[4956]: W0930 05:29:09.816927 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-b2d555ce55da8edf0570ec27985096f3741c9683edb38630083c754120f8f45a WatchSource:0}: Error finding container b2d555ce55da8edf0570ec27985096f3741c9683edb38630083c754120f8f45a: Status 404 returned error can't find the container with id b2d555ce55da8edf0570ec27985096f3741c9683edb38630083c754120f8f45a Sep 30 05:29:09 crc kubenswrapper[4956]: W0930 05:29:09.837690 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-206f993ddc4c592afd8d1af809fa70a1b3a0a652c638969b4227f3af00d58dfd WatchSource:0}: Error finding container 206f993ddc4c592afd8d1af809fa70a1b3a0a652c638969b4227f3af00d58dfd: Status 404 returned error can't find the container with id 206f993ddc4c592afd8d1af809fa70a1b3a0a652c638969b4227f3af00d58dfd Sep 30 05:29:09 crc kubenswrapper[4956]: I0930 05:29:09.873519 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-htk97" Sep 30 05:29:09 crc kubenswrapper[4956]: W0930 05:29:09.885242 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda40fd61_e4f1_4780_bf28_5dd931e1a265.slice/crio-16b7e9a624cc59d8d0bde4230dfc1f4a9626c258dd3d7a42ae1ba0f8c24ee67b WatchSource:0}: Error finding container 16b7e9a624cc59d8d0bde4230dfc1f4a9626c258dd3d7a42ae1ba0f8c24ee67b: Status 404 returned error can't find the container with id 16b7e9a624cc59d8d0bde4230dfc1f4a9626c258dd3d7a42ae1ba0f8c24ee67b Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.111542 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.111602 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.111627 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.111647 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.111666 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.111748 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:29:11.111719049 +0000 UTC m=+21.438839564 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.111775 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.111792 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.111802 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.111844 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:11.111830594 +0000 UTC m=+21.438951119 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.111850 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.111871 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.111895 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:11.111885825 +0000 UTC m=+21.439006350 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.111919 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:11.111909156 +0000 UTC m=+21.439029821 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.111978 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.112009 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.112020 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.112075 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:11.112055731 +0000 UTC m=+21.439176256 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.340384 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.340384 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.340526 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.340395 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.340629 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:10 crc kubenswrapper[4956]: E0930 05:29:10.340660 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.343778 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.344297 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.345102 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.345777 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.346373 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.346872 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.347522 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.348050 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.348653 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.349164 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.349673 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.352275 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.353103 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.353781 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.354364 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.354884 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.355499 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.355942 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.356595 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, 
/tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.357309 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.357931 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.358617 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.359295 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.360089 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.360728 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.362449 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.363855 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.364535 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.364982 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.365592 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.366021 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.366484 4956 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.366583 4956 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.369629 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.370597 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.371264 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.371344 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.373788 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.374586 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.375232 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.375865 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.376577 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.378285 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.378940 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.379921 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.380390 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.380518 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.381339 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.381861 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.382750 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.383540 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.384396 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.384820 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.385600 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.386227 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.386767 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.387631 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.388818 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.397956 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.404739 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.414611 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.423822 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.463492 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4"} Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.463547 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"966ce03fa4f7ec6b85211bfeee22d542a6c359c037e5408f84eaba43ff2c4ba5"} Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.464909 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-htk97" event={"ID":"da40fd61-e4f1-4780-bf28-5dd931e1a265","Type":"ContainerStarted","Data":"ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100"} Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.464934 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-htk97" event={"ID":"da40fd61-e4f1-4780-bf28-5dd931e1a265","Type":"ContainerStarted","Data":"16b7e9a624cc59d8d0bde4230dfc1f4a9626c258dd3d7a42ae1ba0f8c24ee67b"} Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.468046 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62"} Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.468101 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9"} Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.468131 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"206f993ddc4c592afd8d1af809fa70a1b3a0a652c638969b4227f3af00d58dfd"} Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.468891 4956 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b2d555ce55da8edf0570ec27985096f3741c9683edb38630083c754120f8f45a"} Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.476428 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.488662 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.503372 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.514313 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.539803 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.555226 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.580670 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.609487 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.635570 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.654409 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.662952 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.674051 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.686777 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.699213 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.710319 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.723023 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.815692 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-hx8cm"] Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.816149 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-lpcwf"] Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.816172 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.817281 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-frfx9"] Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.817512 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.817674 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.819378 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.819871 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.819887 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.820920 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.820945 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.820945 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.821338 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.822004 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.822214 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.822288 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.822317 4956 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.824479 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.836266 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T
05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.862002 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.873226 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.887004 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.900284 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.911611 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.924072 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.935109 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937028 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ecd015b-e216-40d8-ae78-711b2a65c193-proxy-tls\") pod \"machine-config-daemon-hx8cm\" (UID: \"5ecd015b-e216-40d8-ae78-711b2a65c193\") " pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937061 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-system-cni-dir\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937087 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-multus-socket-dir-parent\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937129 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-run-netns\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937197 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f38dd558-4728-4f7d-b69c-a523b09af345-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937266 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-var-lib-cni-bin\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937308 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f38dd558-4728-4f7d-b69c-a523b09af345-cnibin\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937347 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f38dd558-4728-4f7d-b69c-a523b09af345-system-cni-dir\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937388 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f38dd558-4728-4f7d-b69c-a523b09af345-os-release\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937417 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5ecd015b-e216-40d8-ae78-711b2a65c193-rootfs\") pod \"machine-config-daemon-hx8cm\" (UID: \"5ecd015b-e216-40d8-ae78-711b2a65c193\") " pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937435 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-etc-kubernetes\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937452 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z98c5\" (UniqueName: \"kubernetes.io/projected/f38dd558-4728-4f7d-b69c-a523b09af345-kube-api-access-z98c5\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937472 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-cnibin\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937488 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-multus-conf-dir\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937547 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/72ad9902-843c-4117-9ac1-c34d525c9d55-multus-daemon-config\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937647 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-run-k8s-cni-cncf-io\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937697 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvwjr\" (UniqueName: \"kubernetes.io/projected/72ad9902-843c-4117-9ac1-c34d525c9d55-kube-api-access-pvwjr\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937735 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-multus-cni-dir\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937759 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-hostroot\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937786 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f38dd558-4728-4f7d-b69c-a523b09af345-cni-binary-copy\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937810 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-var-lib-kubelet\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937832 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ecd015b-e216-40d8-ae78-711b2a65c193-mcd-auth-proxy-config\") pod \"machine-config-daemon-hx8cm\" (UID: \"5ecd015b-e216-40d8-ae78-711b2a65c193\") " pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937857 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f38dd558-4728-4f7d-b69c-a523b09af345-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937896 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sjdq\" (UniqueName: \"kubernetes.io/projected/5ecd015b-e216-40d8-ae78-711b2a65c193-kube-api-access-8sjdq\") pod \"machine-config-daemon-hx8cm\" (UID: \"5ecd015b-e216-40d8-ae78-711b2a65c193\") " pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937923 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-os-release\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937941 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72ad9902-843c-4117-9ac1-c34d525c9d55-cni-binary-copy\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.937977 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-var-lib-cni-multus\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc 
kubenswrapper[4956]: I0930 05:29:10.937991 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-run-multus-certs\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.954407 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.970530 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:10 crc kubenswrapper[4956]: I0930 05:29:10.983959 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.001892 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.015984 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.026988 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039490 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-cnibin\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039547 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-multus-conf-dir\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039573 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-run-k8s-cni-cncf-io\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039593 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/72ad9902-843c-4117-9ac1-c34d525c9d55-multus-daemon-config\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039610 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-multus-cni-dir\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039666 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-cnibin\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039725 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-run-k8s-cni-cncf-io\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039748 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-hostroot\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039695 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-hostroot\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " 
pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039834 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvwjr\" (UniqueName: \"kubernetes.io/projected/72ad9902-843c-4117-9ac1-c34d525c9d55-kube-api-access-pvwjr\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039865 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f38dd558-4728-4f7d-b69c-a523b09af345-cni-binary-copy\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039888 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-multus-cni-dir\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039888 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ecd015b-e216-40d8-ae78-711b2a65c193-mcd-auth-proxy-config\") pod \"machine-config-daemon-hx8cm\" (UID: \"5ecd015b-e216-40d8-ae78-711b2a65c193\") " pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039934 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-var-lib-kubelet\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 
crc kubenswrapper[4956]: I0930 05:29:11.039978 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f38dd558-4728-4f7d-b69c-a523b09af345-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.039999 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sjdq\" (UniqueName: \"kubernetes.io/projected/5ecd015b-e216-40d8-ae78-711b2a65c193-kube-api-access-8sjdq\") pod \"machine-config-daemon-hx8cm\" (UID: \"5ecd015b-e216-40d8-ae78-711b2a65c193\") " pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040014 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-os-release\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040028 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72ad9902-843c-4117-9ac1-c34d525c9d55-cni-binary-copy\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040045 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-run-multus-certs\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040060 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-var-lib-cni-multus\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040079 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f38dd558-4728-4f7d-b69c-a523b09af345-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040093 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ecd015b-e216-40d8-ae78-711b2a65c193-proxy-tls\") pod \"machine-config-daemon-hx8cm\" (UID: \"5ecd015b-e216-40d8-ae78-711b2a65c193\") " pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040108 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-system-cni-dir\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040143 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-multus-socket-dir-parent\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040157 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-run-netns\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040178 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-var-lib-cni-bin\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040207 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f38dd558-4728-4f7d-b69c-a523b09af345-system-cni-dir\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040223 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f38dd558-4728-4f7d-b69c-a523b09af345-cnibin\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040262 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f38dd558-4728-4f7d-b69c-a523b09af345-os-release\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040289 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z98c5\" 
(UniqueName: \"kubernetes.io/projected/f38dd558-4728-4f7d-b69c-a523b09af345-kube-api-access-z98c5\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040307 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5ecd015b-e216-40d8-ae78-711b2a65c193-rootfs\") pod \"machine-config-daemon-hx8cm\" (UID: \"5ecd015b-e216-40d8-ae78-711b2a65c193\") " pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040322 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-etc-kubernetes\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040380 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-etc-kubernetes\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040402 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-var-lib-kubelet\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040625 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ecd015b-e216-40d8-ae78-711b2a65c193-mcd-auth-proxy-config\") 
pod \"machine-config-daemon-hx8cm\" (UID: \"5ecd015b-e216-40d8-ae78-711b2a65c193\") " pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040672 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-multus-conf-dir\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.040951 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-system-cni-dir\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041055 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f38dd558-4728-4f7d-b69c-a523b09af345-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041096 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-run-multus-certs\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041257 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-var-lib-cni-multus\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " 
pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041493 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f38dd558-4728-4f7d-b69c-a523b09af345-cni-binary-copy\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041531 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-os-release\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041571 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-run-netns\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041590 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-multus-socket-dir-parent\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041597 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72ad9902-843c-4117-9ac1-c34d525c9d55-host-var-lib-cni-bin\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041616 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f38dd558-4728-4f7d-b69c-a523b09af345-system-cni-dir\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041645 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f38dd558-4728-4f7d-b69c-a523b09af345-cnibin\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041691 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f38dd558-4728-4f7d-b69c-a523b09af345-os-release\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041811 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.041952 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5ecd015b-e216-40d8-ae78-711b2a65c193-rootfs\") pod \"machine-config-daemon-hx8cm\" (UID: \"5ecd015b-e216-40d8-ae78-711b2a65c193\") " 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.043522 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f38dd558-4728-4f7d-b69c-a523b09af345-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.050193 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ecd015b-e216-40d8-ae78-711b2a65c193-proxy-tls\") pod \"machine-config-daemon-hx8cm\" (UID: \"5ecd015b-e216-40d8-ae78-711b2a65c193\") " pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.053082 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72ad9902-843c-4117-9ac1-c34d525c9d55-cni-binary-copy\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.053148 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/72ad9902-843c-4117-9ac1-c34d525c9d55-multus-daemon-config\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.062601 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sjdq\" (UniqueName: \"kubernetes.io/projected/5ecd015b-e216-40d8-ae78-711b2a65c193-kube-api-access-8sjdq\") pod \"machine-config-daemon-hx8cm\" (UID: \"5ecd015b-e216-40d8-ae78-711b2a65c193\") " 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.065062 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z98c5\" (UniqueName: \"kubernetes.io/projected/f38dd558-4728-4f7d-b69c-a523b09af345-kube-api-access-z98c5\") pod \"multus-additional-cni-plugins-lpcwf\" (UID: \"f38dd558-4728-4f7d-b69c-a523b09af345\") " pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.068685 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.071373 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvwjr\" (UniqueName: \"kubernetes.io/projected/72ad9902-843c-4117-9ac1-c34d525c9d55-kube-api-access-pvwjr\") pod \"multus-frfx9\" (UID: \"72ad9902-843c-4117-9ac1-c34d525c9d55\") " pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.088870 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.105405 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.118707 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.134295 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11
\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.138431 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.141403 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.141485 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.141511 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:11 
crc kubenswrapper[4956]: E0930 05:29:11.141535 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:29:13.141510925 +0000 UTC m=+23.468631450 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.141573 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.141616 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:11 crc kubenswrapper[4956]: E0930 05:29:11.141620 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:11 crc kubenswrapper[4956]: E0930 05:29:11.141662 4956 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:11 crc kubenswrapper[4956]: E0930 05:29:11.141676 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:11 crc kubenswrapper[4956]: E0930 05:29:11.141677 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:11 crc kubenswrapper[4956]: E0930 05:29:11.141688 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:11 crc kubenswrapper[4956]: E0930 05:29:11.141691 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:11 crc kubenswrapper[4956]: E0930 05:29:11.141689 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:11 crc kubenswrapper[4956]: E0930 05:29:11.141729 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-09-30 05:29:13.141722411 +0000 UTC m=+23.468842936 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:11 crc kubenswrapper[4956]: E0930 05:29:11.141745 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:13.141738132 +0000 UTC m=+23.468858657 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:11 crc kubenswrapper[4956]: E0930 05:29:11.141784 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:13.141759673 +0000 UTC m=+23.468880278 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:11 crc kubenswrapper[4956]: E0930 05:29:11.142073 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:11 crc kubenswrapper[4956]: E0930 05:29:11.142163 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:13.142147916 +0000 UTC m=+23.469268441 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.145396 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.156003 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-frfx9" Sep 30 05:29:11 crc kubenswrapper[4956]: W0930 05:29:11.172320 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf38dd558_4728_4f7d_b69c_a523b09af345.slice/crio-e8567518eb0eda5d67b239097ce5b2435f45723b6fb83b68598875fc1fd54608 WatchSource:0}: Error finding container e8567518eb0eda5d67b239097ce5b2435f45723b6fb83b68598875fc1fd54608: Status 404 returned error can't find the container with id e8567518eb0eda5d67b239097ce5b2435f45723b6fb83b68598875fc1fd54608 Sep 30 05:29:11 crc kubenswrapper[4956]: W0930 05:29:11.183846 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72ad9902_843c_4117_9ac1_c34d525c9d55.slice/crio-c3f5286f203b62aa620f4f720f23909d466916fd46eb4e4b9d90b7863ea2cd28 WatchSource:0}: Error finding container c3f5286f203b62aa620f4f720f23909d466916fd46eb4e4b9d90b7863ea2cd28: Status 404 returned error can't find the container with id c3f5286f203b62aa620f4f720f23909d466916fd46eb4e4b9d90b7863ea2cd28 Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.223512 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j8sw2"] Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.224221 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.226360 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.226631 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.226726 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.226824 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.229964 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.230506 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.230670 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.250280 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.255661 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.283285 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.289460 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.297612 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.318256 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-ce
rt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-open
vswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-
openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.333367 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.344067 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.345359 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-kubelet\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.345467 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-systemd-units\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.345540 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-log-socket\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.345604 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5xxz\" (UniqueName: \"kubernetes.io/projected/29df1c73-1262-4143-b710-bc690edc2ab8-kube-api-access-p5xxz\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.345686 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-ovn\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.345754 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-var-lib-openvswitch\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.345822 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-run-ovn-kubernetes\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 
05:29:11.345887 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.345958 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-ovnkube-script-lib\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.346034 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-openvswitch\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.346123 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-env-overrides\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.346204 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-cni-netd\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.346269 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-cni-bin\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.346344 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-ovnkube-config\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.346425 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-run-netns\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.346496 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-node-log\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.346585 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29df1c73-1262-4143-b710-bc690edc2ab8-ovn-node-metrics-cert\") pod \"ovnkube-node-j8sw2\" (UID: 
\"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.346654 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-etc-openvswitch\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.346734 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-slash\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.346810 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-systemd\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.355212 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.367093 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.386056 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.397742 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11
\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.412755 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.424586 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.439009 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447452 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-run-netns\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447490 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-etc-openvswitch\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc 
kubenswrapper[4956]: I0930 05:29:11.447506 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-node-log\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447522 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29df1c73-1262-4143-b710-bc690edc2ab8-ovn-node-metrics-cert\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447546 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-slash\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447559 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-systemd\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447577 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-kubelet\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447598 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-systemd-units\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447613 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-log-socket\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447628 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5xxz\" (UniqueName: \"kubernetes.io/projected/29df1c73-1262-4143-b710-bc690edc2ab8-kube-api-access-p5xxz\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447644 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-ovn\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447659 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-var-lib-openvswitch\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447674 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-run-ovn-kubernetes\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447690 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447707 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-ovnkube-script-lib\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447721 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-openvswitch\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447736 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-env-overrides\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447759 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-cni-netd\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447775 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-cni-bin\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.447789 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-ovnkube-config\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.448358 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-ovnkube-config\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.448405 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-run-netns\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.448430 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-etc-openvswitch\") pod 
\"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.448450 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-node-log\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.448845 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-var-lib-openvswitch\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.448881 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-systemd-units\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.448931 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-kubelet\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.448946 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-cni-netd\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 
05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.448955 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-slash\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.448997 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-cni-bin\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.448995 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-openvswitch\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.449070 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-run-ovn-kubernetes\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.449102 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.449203 
4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-ovn\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.449240 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-log-socket\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.449281 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-env-overrides\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.449361 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-systemd\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.449703 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-ovnkube-script-lib\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.452175 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/29df1c73-1262-4143-b710-bc690edc2ab8-ovn-node-metrics-cert\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.457843 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.471155 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5xxz\" (UniqueName: \"kubernetes.io/projected/29df1c73-1262-4143-b710-bc690edc2ab8-kube-api-access-p5xxz\") pod \"ovnkube-node-j8sw2\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.472423 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627"} Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.472462 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef"} Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.472475 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"d1c2d0096377cf4f55f6806c2048a6d31e5462956b3d6176b9dbb5cdade1858a"} Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.473718 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-frfx9" event={"ID":"72ad9902-843c-4117-9ac1-c34d525c9d55","Type":"ContainerStarted","Data":"7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b"} Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.473840 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-frfx9" event={"ID":"72ad9902-843c-4117-9ac1-c34d525c9d55","Type":"ContainerStarted","Data":"c3f5286f203b62aa620f4f720f23909d466916fd46eb4e4b9d90b7863ea2cd28"} Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.474107 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.474827 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" event={"ID":"f38dd558-4728-4f7d-b69c-a523b09af345","Type":"ContainerStarted","Data":"172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4"} Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.474877 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" event={"ID":"f38dd558-4728-4f7d-b69c-a523b09af345","Type":"ContainerStarted","Data":"e8567518eb0eda5d67b239097ce5b2435f45723b6fb83b68598875fc1fd54608"} Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.487435 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.504672 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.515844 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.539163 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.541055 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: W0930 05:29:11.552251 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29df1c73_1262_4143_b710_bc690edc2ab8.slice/crio-381f02b5a6d726d256e21bc3cf089c46017fbf7dcfd333bb93bdb6b36b240074 WatchSource:0}: Error finding container 381f02b5a6d726d256e21bc3cf089c46017fbf7dcfd333bb93bdb6b36b240074: Status 404 returned error can't find the container with id 381f02b5a6d726d256e21bc3cf089c46017fbf7dcfd333bb93bdb6b36b240074 Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.573301 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.586012 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.597411 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.611629 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.627089 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.639751 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.651470 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11
\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.666858 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.680957 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.711735 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.751887 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.792617 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.831009 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.832335 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.834938 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.858190 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.895448 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.946051 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:11 crc kubenswrapper[4956]: I0930 05:29:11.971975 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.017331 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.052423 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.095062 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.138030 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c0199
2f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.170631 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.211050 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.249413 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.291255 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.326017 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-xlssx"] Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.326405 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xlssx" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.334998 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.340209 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:12 crc kubenswrapper[4956]: E0930 05:29:12.340302 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.340213 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.340426 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:12 crc kubenswrapper[4956]: E0930 05:29:12.340602 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:12 crc kubenswrapper[4956]: E0930 05:29:12.340729 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.341481 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.364599 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.384327 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.402209 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.461934 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3be91b1-5806-4319-b8e2-71d37a81bc69-host\") pod \"node-ca-xlssx\" (UID: \"d3be91b1-5806-4319-b8e2-71d37a81bc69\") " pod="openshift-image-registry/node-ca-xlssx" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.462039 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjcqm\" (UniqueName: 
\"kubernetes.io/projected/d3be91b1-5806-4319-b8e2-71d37a81bc69-kube-api-access-qjcqm\") pod \"node-ca-xlssx\" (UID: \"d3be91b1-5806-4319-b8e2-71d37a81bc69\") " pod="openshift-image-registry/node-ca-xlssx" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.462108 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d3be91b1-5806-4319-b8e2-71d37a81bc69-serviceca\") pod \"node-ca-xlssx\" (UID: \"d3be91b1-5806-4319-b8e2-71d37a81bc69\") " pod="openshift-image-registry/node-ca-xlssx" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.479276 4956 generic.go:334] "Generic (PLEG): container finished" podID="f38dd558-4728-4f7d-b69c-a523b09af345" containerID="172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4" exitCode=0 Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.479453 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" event={"ID":"f38dd558-4728-4f7d-b69c-a523b09af345","Type":"ContainerDied","Data":"172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4"} Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.482772 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" containerID="05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f" exitCode=0 Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.482839 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f"} Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.482877 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" 
event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerStarted","Data":"381f02b5a6d726d256e21bc3cf089c46017fbf7dcfd333bb93bdb6b36b240074"} Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.484650 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.484857 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192"} Sep 30 05:29:12 crc kubenswrapper[4956]: E0930 05:29:12.495319 4956 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.519017 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.552776 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.563243 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjcqm\" (UniqueName: \"kubernetes.io/projected/d3be91b1-5806-4319-b8e2-71d37a81bc69-kube-api-access-qjcqm\") pod \"node-ca-xlssx\" (UID: \"d3be91b1-5806-4319-b8e2-71d37a81bc69\") " pod="openshift-image-registry/node-ca-xlssx" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.563304 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d3be91b1-5806-4319-b8e2-71d37a81bc69-serviceca\") pod \"node-ca-xlssx\" (UID: \"d3be91b1-5806-4319-b8e2-71d37a81bc69\") " pod="openshift-image-registry/node-ca-xlssx" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.563345 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3be91b1-5806-4319-b8e2-71d37a81bc69-host\") pod \"node-ca-xlssx\" (UID: \"d3be91b1-5806-4319-b8e2-71d37a81bc69\") " pod="openshift-image-registry/node-ca-xlssx" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.563746 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3be91b1-5806-4319-b8e2-71d37a81bc69-host\") pod \"node-ca-xlssx\" (UID: \"d3be91b1-5806-4319-b8e2-71d37a81bc69\") " pod="openshift-image-registry/node-ca-xlssx" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.564316 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d3be91b1-5806-4319-b8e2-71d37a81bc69-serviceca\") pod \"node-ca-xlssx\" (UID: \"d3be91b1-5806-4319-b8e2-71d37a81bc69\") " pod="openshift-image-registry/node-ca-xlssx" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.608393 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjcqm\" (UniqueName: \"kubernetes.io/projected/d3be91b1-5806-4319-b8e2-71d37a81bc69-kube-api-access-qjcqm\") pod \"node-ca-xlssx\" (UID: \"d3be91b1-5806-4319-b8e2-71d37a81bc69\") " pod="openshift-image-registry/node-ca-xlssx" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.611986 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.654519 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.698334 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.729359 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.772016 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.818628 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.862616 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.885994 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xlssx" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.891592 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.939213 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var
/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:12 crc kubenswrapper[4956]: I0930 05:29:12.972294 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.010595 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.051593 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.090380 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.134164 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.167889 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.167991 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.168024 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168064 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:29:17.168038949 +0000 UTC m=+27.495159474 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168105 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.168128 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.168184 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168204 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:17.168187224 +0000 UTC m=+27.495307829 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168218 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168244 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168255 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168255 4956 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168307 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:17.168291657 +0000 UTC m=+27.495412182 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168316 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168330 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168335 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:17.168315978 +0000 UTC m=+27.495436503 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168341 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:13 crc kubenswrapper[4956]: E0930 05:29:13.168374 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:17.16836577 +0000 UTC m=+27.495486285 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.189793 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5b
da3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.209561 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.251095 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.294716 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.331305 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.370593 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.410671 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.491975 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerStarted","Data":"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef"} Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.492032 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" 
event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerStarted","Data":"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1"} Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.492051 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerStarted","Data":"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29"} Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.492067 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerStarted","Data":"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b"} Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.492082 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerStarted","Data":"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125"} Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.492096 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerStarted","Data":"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360"} Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.493568 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xlssx" event={"ID":"d3be91b1-5806-4319-b8e2-71d37a81bc69","Type":"ContainerStarted","Data":"8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c"} Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.493608 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xlssx" 
event={"ID":"d3be91b1-5806-4319-b8e2-71d37a81bc69","Type":"ContainerStarted","Data":"bd41ed3f52514e6fdc2355c8bd748d675f05059521dec824fc90b02f11103dd9"} Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.497142 4956 generic.go:334] "Generic (PLEG): container finished" podID="f38dd558-4728-4f7d-b69c-a523b09af345" containerID="be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517" exitCode=0 Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.497696 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" event={"ID":"f38dd558-4728-4f7d-b69c-a523b09af345","Type":"ContainerDied","Data":"be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517"} Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.518190 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7
866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50
c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.537383 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.550209 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.577187 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.610477 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.650175 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.690711 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.729664 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.771321 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.812497 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.851547 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.889984 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.929758 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:13 crc kubenswrapper[4956]: I0930 05:29:13.971296 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.011491 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.058456 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.089728 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.135015 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.176640 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.208557 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.251380 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.291861 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.331537 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.340834 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.340877 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.340833 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:14 crc kubenswrapper[4956]: E0930 05:29:14.340950 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:14 crc kubenswrapper[4956]: E0930 05:29:14.341094 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:14 crc kubenswrapper[4956]: E0930 05:29:14.341210 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.371532 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.424428 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.447390 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.489042 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.502405 4956 generic.go:334] "Generic (PLEG): container finished" podID="f38dd558-4728-4f7d-b69c-a523b09af345" containerID="f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d" exitCode=0 Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.502457 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" event={"ID":"f38dd558-4728-4f7d-b69c-a523b09af345","Type":"ContainerDied","Data":"f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d"} Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.531648 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.571552 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.611246 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.654090 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.690265 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.736682 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 
05:29:14.778766 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.808509 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.849797 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.894610 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.930848 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:14 crc kubenswrapper[4956]: I0930 05:29:14.970630 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.009598 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.051530 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.094025 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.133759 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.170330 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.258933 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.507678 4956 generic.go:334] "Generic (PLEG): container finished" podID="f38dd558-4728-4f7d-b69c-a523b09af345" containerID="4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873" exitCode=0 Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.507746 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" event={"ID":"f38dd558-4728-4f7d-b69c-a523b09af345","Type":"ContainerDied","Data":"4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873"} Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.515903 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" 
event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerStarted","Data":"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10"} Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.522471 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92ed
af5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.539589 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.550978 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.559497 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.567602 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.579963 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, 
/tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.590742 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.602823 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.619874 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.630569 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.648937 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.694160 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.729806 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.769998 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.798237 4956 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.799826 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.799859 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.799870 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.799928 4956 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.809155 4956 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.843201 4956 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.843438 4956 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.844481 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.844515 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.844525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.844545 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.844556 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:15Z","lastTransitionTime":"2025-09-30T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:15 crc kubenswrapper[4956]: E0930 05:29:15.855384 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.859219 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.859248 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.859259 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.859272 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.859282 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:15Z","lastTransitionTime":"2025-09-30T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:15 crc kubenswrapper[4956]: E0930 05:29:15.869671 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.873008 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.873036 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.873046 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.873059 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.873069 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:15Z","lastTransitionTime":"2025-09-30T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:15 crc kubenswrapper[4956]: E0930 05:29:15.883466 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z"
Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.886266 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.886290 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.886299 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.886312 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.886322 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:15Z","lastTransitionTime":"2025-09-30T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:15 crc kubenswrapper[4956]: E0930 05:29:15.896390 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z"
Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.898851 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.898890 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.898900 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.898915 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.898924 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:15Z","lastTransitionTime":"2025-09-30T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:15 crc kubenswrapper[4956]: E0930 05:29:15.908510 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:15Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:15 crc kubenswrapper[4956]: E0930 05:29:15.908666 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.909959 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.910016 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.910031 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.910046 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:15 crc kubenswrapper[4956]: I0930 05:29:15.910056 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:15Z","lastTransitionTime":"2025-09-30T05:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.013085 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.013130 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.013138 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.013152 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.013182 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:16Z","lastTransitionTime":"2025-09-30T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.117629 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.117662 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.117670 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.117685 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.117694 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:16Z","lastTransitionTime":"2025-09-30T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.219630 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.219653 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.219662 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.219673 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.219681 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:16Z","lastTransitionTime":"2025-09-30T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.322441 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.322474 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.322484 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.322497 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.322507 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:16Z","lastTransitionTime":"2025-09-30T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.341275 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:16 crc kubenswrapper[4956]: E0930 05:29:16.341399 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.341448 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.341489 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:16 crc kubenswrapper[4956]: E0930 05:29:16.341564 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:16 crc kubenswrapper[4956]: E0930 05:29:16.341702 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.424991 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.425269 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.425280 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.425296 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.425306 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:16Z","lastTransitionTime":"2025-09-30T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.523163 4956 generic.go:334] "Generic (PLEG): container finished" podID="f38dd558-4728-4f7d-b69c-a523b09af345" containerID="efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e" exitCode=0 Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.523201 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" event={"ID":"f38dd558-4728-4f7d-b69c-a523b09af345","Type":"ContainerDied","Data":"efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e"} Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.527486 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.527510 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.527517 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.527529 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.527540 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:16Z","lastTransitionTime":"2025-09-30T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.538142 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.553632 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.572473 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.585218 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.598433 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.616406 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.634978 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.635014 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.635031 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.635052 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.635064 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:16Z","lastTransitionTime":"2025-09-30T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.638623 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.649789 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.660022 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.669259 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.680395 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.691327 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.700224 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.711634 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.728204 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:16Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.737175 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.737218 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.737229 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.737254 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.737265 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:16Z","lastTransitionTime":"2025-09-30T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.839727 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.839763 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.839775 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.839791 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.839802 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:16Z","lastTransitionTime":"2025-09-30T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.943757 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.943849 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.943868 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.943894 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:16 crc kubenswrapper[4956]: I0930 05:29:16.943913 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:16Z","lastTransitionTime":"2025-09-30T05:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.046266 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.046317 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.046333 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.046355 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.046370 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:17Z","lastTransitionTime":"2025-09-30T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.149620 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.149680 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.149698 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.149725 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.149745 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:17Z","lastTransitionTime":"2025-09-30T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.213170 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.213333 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.213379 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.213424 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.213477 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.213646 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.213673 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.213692 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.213756 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:25.213734329 +0000 UTC m=+35.540854884 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.214206 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.214251 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.214285 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.214310 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:29:25.214271988 +0000 UTC m=+35.541392553 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.214314 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.214357 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:25.2143408 +0000 UTC m=+35.541461355 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.214382 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:25.214370671 +0000 UTC m=+35.541491236 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.214380 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:17 crc kubenswrapper[4956]: E0930 05:29:17.214475 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:25.214452574 +0000 UTC m=+35.541573139 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.252969 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.253024 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.253041 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.253067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.253084 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:17Z","lastTransitionTime":"2025-09-30T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.357032 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.357084 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.357102 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.357165 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.357184 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:17Z","lastTransitionTime":"2025-09-30T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.460680 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.460739 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.460757 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.460781 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.460799 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:17Z","lastTransitionTime":"2025-09-30T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.536739 4956 generic.go:334] "Generic (PLEG): container finished" podID="f38dd558-4728-4f7d-b69c-a523b09af345" containerID="99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662" exitCode=0 Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.536816 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" event={"ID":"f38dd558-4728-4f7d-b69c-a523b09af345","Type":"ContainerDied","Data":"99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662"} Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.553033 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.563440 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.563500 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.563512 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.563529 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.563541 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:17Z","lastTransitionTime":"2025-09-30T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.569242 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.582562 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.591995 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.600706 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.612562 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, 
/tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.628411 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.644446 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.665337 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.666748 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.666812 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.666827 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.666843 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.666875 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:17Z","lastTransitionTime":"2025-09-30T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.680175 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.696059 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.719526 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.730355 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.747833 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.759735 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:17Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.768482 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.768530 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.768546 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:17 crc 
kubenswrapper[4956]: I0930 05:29:17.768570 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.768585 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:17Z","lastTransitionTime":"2025-09-30T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.870442 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.870488 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.870498 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.870517 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.870528 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:17Z","lastTransitionTime":"2025-09-30T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.972897 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.972930 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.972939 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.972954 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:17 crc kubenswrapper[4956]: I0930 05:29:17.972965 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:17Z","lastTransitionTime":"2025-09-30T05:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.075726 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.075764 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.075772 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.075785 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.075794 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:18Z","lastTransitionTime":"2025-09-30T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.179394 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.179481 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.179501 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.179529 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.179548 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:18Z","lastTransitionTime":"2025-09-30T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.282406 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.282475 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.282501 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.282533 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.282618 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:18Z","lastTransitionTime":"2025-09-30T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.340485 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:18 crc kubenswrapper[4956]: E0930 05:29:18.340704 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.340519 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:18 crc kubenswrapper[4956]: E0930 05:29:18.340915 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.340509 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:18 crc kubenswrapper[4956]: E0930 05:29:18.341064 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.385572 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.385649 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.385660 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.385683 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.385695 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:18Z","lastTransitionTime":"2025-09-30T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.489014 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.489050 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.489059 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.489073 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.489083 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:18Z","lastTransitionTime":"2025-09-30T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.546825 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerStarted","Data":"91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949"} Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.547286 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.554196 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" event={"ID":"f38dd558-4728-4f7d-b69c-a523b09af345","Type":"ContainerStarted","Data":"a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108"} Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.563051 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.577941 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.589951 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.591330 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.591405 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.591431 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.591464 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.591506 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:18Z","lastTransitionTime":"2025-09-30T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.602025 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e1
2ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.613317 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.627019 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.627371 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.645300 4956 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.663034 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.676956 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.692802 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, 
/tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.694211 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.694255 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.694267 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.694284 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.694296 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:18Z","lastTransitionTime":"2025-09-30T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.715946 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.734273 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361
ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.759586 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"conta
inerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.772434 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.784932 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.797269 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.797311 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.797322 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.797337 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.797349 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:18Z","lastTransitionTime":"2025-09-30T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.802421 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.814329 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.825276 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.842476 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.852722 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.863897 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.872549 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.883161 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.896292 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.900489 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.900526 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.900537 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.900552 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.900563 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:18Z","lastTransitionTime":"2025-09-30T05:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.909173 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.922002 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.933619 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.946363 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, 
/tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.952022 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.958396 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.978061 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:18 crc kubenswrapper[4956]: I0930 05:29:18.991324 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.001286 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.002849 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.002891 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.002902 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.002928 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.002941 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:19Z","lastTransitionTime":"2025-09-30T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.011816 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.021852 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.035333 4956 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.047787 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.063038 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.080766 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.098342 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.105517 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.105556 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.105568 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.105588 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.105600 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:19Z","lastTransitionTime":"2025-09-30T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.119917 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.142583 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.177702 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.200824 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.208154 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.208194 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.208206 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.208222 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.208230 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:19Z","lastTransitionTime":"2025-09-30T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.212058 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.223483 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.311386 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.311421 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.311431 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.311443 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.311453 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:19Z","lastTransitionTime":"2025-09-30T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.414268 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.414314 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.414326 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.414343 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.414356 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:19Z","lastTransitionTime":"2025-09-30T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.517600 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.517651 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.517666 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.517686 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.517703 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:19Z","lastTransitionTime":"2025-09-30T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.557950 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.558585 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.583037 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.599827 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.616273 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.620363 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.620421 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.620438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.620461 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.620479 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:19Z","lastTransitionTime":"2025-09-30T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.637032 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcdda
d1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.662805 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.686606 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.712033 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.722374 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.722442 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.722456 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.722477 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.722491 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:19Z","lastTransitionTime":"2025-09-30T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.738767 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.761474 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.807869 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.825043 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.825087 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.825095 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.825122 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.825135 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:19Z","lastTransitionTime":"2025-09-30T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.839197 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.851741 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.863815 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.875416 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.885679 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.897088 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:19Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.927532 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.927580 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.927594 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.927611 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:19 crc kubenswrapper[4956]: I0930 05:29:19.927625 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:19Z","lastTransitionTime":"2025-09-30T05:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.029356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.029385 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.029393 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.029405 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.029414 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:20Z","lastTransitionTime":"2025-09-30T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.132403 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.132450 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.132461 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.132480 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.132492 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:20Z","lastTransitionTime":"2025-09-30T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.234985 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.235018 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.235026 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.235041 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.235050 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:20Z","lastTransitionTime":"2025-09-30T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.336960 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.336996 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.337004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.337019 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.337028 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:20Z","lastTransitionTime":"2025-09-30T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.340263 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.340279 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:20 crc kubenswrapper[4956]: E0930 05:29:20.340355 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.340500 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:20 crc kubenswrapper[4956]: E0930 05:29:20.340617 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:20 crc kubenswrapper[4956]: E0930 05:29:20.340701 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.354456 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.365702 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.376207 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.387047 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.400000 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.410890 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.419648 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.428264 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.438971 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.439021 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.439036 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.439054 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.439079 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:20Z","lastTransitionTime":"2025-09-30T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.444889 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5
0aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.467159 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.481548 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.501789 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7
f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.516079 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.538576 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.543980 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.544009 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.544020 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.544033 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.544043 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:20Z","lastTransitionTime":"2025-09-30T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.561951 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.562700 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/0.log" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.565388 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" containerID="91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949" exitCode=1 Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.565430 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949"} Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.566679 4956 scope.go:117] "RemoveContainer" containerID="91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.577383 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.592067 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.608879 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:20Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 05:29:20.329422 6276 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0930 05:29:20.329616 6276 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 05:29:20.329891 6276 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 05:29:20.329894 6276 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 05:29:20.329967 6276 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 05:29:20.330049 6276 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 05:29:20.330608 6276 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 05:29:20.330678 6276 factory.go:656] Stopping watch factory\\\\nI0930 05:29:20.330693 6276 ovnkube.go:599] Stopped ovnkube\\\\nI0930 
05:29:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f3
1a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.626820 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.638766 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.649994 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.651258 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.651285 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.651298 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.651315 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.651326 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:20Z","lastTransitionTime":"2025-09-30T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.660013 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.673180 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.684031 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkub
e-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.691956 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.700438 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.711047 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.722096 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.734378 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.746824 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.753302 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.753345 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.753359 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.753378 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.753389 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:20Z","lastTransitionTime":"2025-09-30T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.855767 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.855807 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.855816 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.855830 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.855840 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:20Z","lastTransitionTime":"2025-09-30T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.958175 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.958200 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.958207 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.958219 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:20 crc kubenswrapper[4956]: I0930 05:29:20.958227 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:20Z","lastTransitionTime":"2025-09-30T05:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.061920 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.062212 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.062297 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.062368 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.062438 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:21Z","lastTransitionTime":"2025-09-30T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.164211 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.164257 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.164268 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.164284 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.164294 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:21Z","lastTransitionTime":"2025-09-30T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.266556 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.266782 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.266890 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.266982 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.267054 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:21Z","lastTransitionTime":"2025-09-30T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.369311 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.369341 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.369349 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.369362 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.369370 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:21Z","lastTransitionTime":"2025-09-30T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.471821 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.472921 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.473035 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.473147 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.473235 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:21Z","lastTransitionTime":"2025-09-30T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.569559 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/1.log" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.570490 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/0.log" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.572898 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" containerID="1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255" exitCode=1 Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.572942 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255"} Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.572985 4956 scope.go:117] "RemoveContainer" containerID="91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.573841 4956 scope.go:117] "RemoveContainer" containerID="1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255" Sep 30 05:29:21 crc kubenswrapper[4956]: E0930 05:29:21.574745 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.577309 4956 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.577346 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.577356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.577370 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.577379 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:21Z","lastTransitionTime":"2025-09-30T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.591457 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.606364 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.620147 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.641542 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7
f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.655896 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.670268 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.679954 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.679999 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.680017 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.680039 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.680055 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:21Z","lastTransitionTime":"2025-09-30T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.688891 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91088a5a573637e9de7f94ec7fd77370c9d28140d1d04beec78c3d8f6015b949\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:20Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 05:29:20.329422 6276 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 05:29:20.329616 6276 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0930 05:29:20.329891 6276 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 05:29:20.329894 6276 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 05:29:20.329967 6276 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 05:29:20.330049 6276 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 05:29:20.330608 6276 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 05:29:20.330678 6276 factory.go:656] Stopping watch factory\\\\nI0930 05:29:20.330693 6276 ovnkube.go:599] Stopped ovnkube\\\\nI0930 05:29:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:21Z\\\",\\\"message\\\":\\\"ws:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 05:29:21.439153 6405 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-xlssx in node crc\\\\nI0930 05:29:21.439593 6405 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-xlssx after 0 failed attempt(s)\\\\nI0930 
05:29:21.439606 6405 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xlssx\\\\nF0930 05:29:21.439070 6405 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curren\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni
-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.705333 4956 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.716249 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.730025 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.747401 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.766823 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.783692 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.783743 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.783761 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.783785 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.783801 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:21Z","lastTransitionTime":"2025-09-30T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.787708 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.799846 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.808144 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:21Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.885927 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.885990 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.886008 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.886032 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.886049 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:21Z","lastTransitionTime":"2025-09-30T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.988266 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.988331 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.988349 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.988374 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:21 crc kubenswrapper[4956]: I0930 05:29:21.988392 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:21Z","lastTransitionTime":"2025-09-30T05:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.090817 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.090851 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.090859 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.090891 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.090900 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:22Z","lastTransitionTime":"2025-09-30T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.196101 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.196162 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.196175 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.196200 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.196434 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:22Z","lastTransitionTime":"2025-09-30T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.299326 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.299393 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.299411 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.299434 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.299452 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:22Z","lastTransitionTime":"2025-09-30T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.341052 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:22 crc kubenswrapper[4956]: E0930 05:29:22.341307 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.341890 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:22 crc kubenswrapper[4956]: E0930 05:29:22.342001 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.342181 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:22 crc kubenswrapper[4956]: E0930 05:29:22.342285 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.401818 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.401872 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.401889 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.401914 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.401934 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:22Z","lastTransitionTime":"2025-09-30T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.504577 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.504649 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.504670 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.504697 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.504717 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:22Z","lastTransitionTime":"2025-09-30T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.578726 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/1.log" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.584511 4956 scope.go:117] "RemoveContainer" containerID="1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255" Sep 30 05:29:22 crc kubenswrapper[4956]: E0930 05:29:22.584828 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.599387 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.607589 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.607659 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.607677 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.607703 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.607720 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:22Z","lastTransitionTime":"2025-09-30T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.616530 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.633822 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.647822 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.660477 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.676663 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\"
,\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.691927 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.693372 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf"] Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.694217 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.696543 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.696758 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.711352 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.711403 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.711416 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.711433 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.711446 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:22Z","lastTransitionTime":"2025-09-30T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.711895 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.740246 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.756285 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.776600 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.800857 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:21Z\\\",\\\"message\\\":\\\"ws:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 05:29:21.439153 6405 ovn.go:134] Ensuring zone local for Pod 
openshift-image-registry/node-ca-xlssx in node crc\\\\nI0930 05:29:21.439593 6405 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-xlssx after 0 failed attempt(s)\\\\nI0930 05:29:21.439606 6405 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xlssx\\\\nF0930 05:29:21.439070 6405 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curren\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.814048 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.814567 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.814627 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.814649 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 
05:29:22.814680 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.814704 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:22Z","lastTransitionTime":"2025-09-30T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.828668 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.841444 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.856835 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\"
,\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.872738 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nf2d\" (UniqueName: \"kubernetes.io/projected/a7d99135-2439-4836-8a69-f4cabc091bb6-kube-api-access-5nf2d\") pod \"ovnkube-control-plane-749d76644c-x27rf\" (UID: \"a7d99135-2439-4836-8a69-f4cabc091bb6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.872805 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7d99135-2439-4836-8a69-f4cabc091bb6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x27rf\" (UID: \"a7d99135-2439-4836-8a69-f4cabc091bb6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.872977 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7d99135-2439-4836-8a69-f4cabc091bb6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x27rf\" (UID: \"a7d99135-2439-4836-8a69-f4cabc091bb6\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.873029 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7d99135-2439-4836-8a69-f4cabc091bb6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x27rf\" (UID: \"a7d99135-2439-4836-8a69-f4cabc091bb6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.877592 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.895708 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.917703 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.917749 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.917766 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.917785 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.917800 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:22Z","lastTransitionTime":"2025-09-30T05:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.919868 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.933260 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.947075 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.973532 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7d99135-2439-4836-8a69-f4cabc091bb6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x27rf\" (UID: \"a7d99135-2439-4836-8a69-f4cabc091bb6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.973598 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7d99135-2439-4836-8a69-f4cabc091bb6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x27rf\" (UID: \"a7d99135-2439-4836-8a69-f4cabc091bb6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.973646 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nf2d\" (UniqueName: \"kubernetes.io/projected/a7d99135-2439-4836-8a69-f4cabc091bb6-kube-api-access-5nf2d\") pod \"ovnkube-control-plane-749d76644c-x27rf\" (UID: \"a7d99135-2439-4836-8a69-f4cabc091bb6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.973670 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a7d99135-2439-4836-8a69-f4cabc091bb6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x27rf\" (UID: \"a7d99135-2439-4836-8a69-f4cabc091bb6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.974335 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/a7d99135-2439-4836-8a69-f4cabc091bb6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x27rf\" (UID: \"a7d99135-2439-4836-8a69-f4cabc091bb6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.974796 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a7d99135-2439-4836-8a69-f4cabc091bb6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x27rf\" (UID: \"a7d99135-2439-4836-8a69-f4cabc091bb6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.979597 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:21Z\\\",\\\"message\\\":\\\"ws:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 05:29:21.439153 6405 ovn.go:134] Ensuring zone local for Pod 
openshift-image-registry/node-ca-xlssx in node crc\\\\nI0930 05:29:21.439593 6405 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-xlssx after 0 failed attempt(s)\\\\nI0930 05:29:21.439606 6405 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xlssx\\\\nF0930 05:29:21.439070 6405 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curren\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.980802 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a7d99135-2439-4836-8a69-f4cabc091bb6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x27rf\" (UID: \"a7d99135-2439-4836-8a69-f4cabc091bb6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:22 crc kubenswrapper[4956]: I0930 05:29:22.994236 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:22Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.005300 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nf2d\" (UniqueName: \"kubernetes.io/projected/a7d99135-2439-4836-8a69-f4cabc091bb6-kube-api-access-5nf2d\") pod \"ovnkube-control-plane-749d76644c-x27rf\" (UID: \"a7d99135-2439-4836-8a69-f4cabc091bb6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.011292 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.011643 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.020403 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.020465 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.020484 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 
05:29:23.020508 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.020523 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:23Z","lastTransitionTime":"2025-09-30T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.066288 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: W0930 05:29:23.068842 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7d99135_2439_4836_8a69_f4cabc091bb6.slice/crio-7646841e64b3ca3b1525a033059e9f5d4236b481cca9abfd11c99d95f234ac33 WatchSource:0}: Error finding container 7646841e64b3ca3b1525a033059e9f5d4236b481cca9abfd11c99d95f234ac33: Status 404 
returned error can't find the container with id 7646841e64b3ca3b1525a033059e9f5d4236b481cca9abfd11c99d95f234ac33 Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.083259 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\
\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.097642 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.113209 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.125353 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.125387 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.125399 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.125415 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.125427 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:23Z","lastTransitionTime":"2025-09-30T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.128220 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.139813 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.157266 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.227170 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.227206 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.227215 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.227229 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.227238 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:23Z","lastTransitionTime":"2025-09-30T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.330217 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.330247 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.330256 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.330270 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.330279 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:23Z","lastTransitionTime":"2025-09-30T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.433465 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.433502 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.433511 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.433527 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.433540 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:23Z","lastTransitionTime":"2025-09-30T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.536318 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.536348 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.536356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.536372 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.536380 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:23Z","lastTransitionTime":"2025-09-30T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.587849 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" event={"ID":"a7d99135-2439-4836-8a69-f4cabc091bb6","Type":"ContainerStarted","Data":"4f625b23e389e00b2adf1c17941712bab69b4047e8bac7c84ad0655fb825cc11"} Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.588088 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" event={"ID":"a7d99135-2439-4836-8a69-f4cabc091bb6","Type":"ContainerStarted","Data":"d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed"} Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.588246 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" event={"ID":"a7d99135-2439-4836-8a69-f4cabc091bb6","Type":"ContainerStarted","Data":"7646841e64b3ca3b1525a033059e9f5d4236b481cca9abfd11c99d95f234ac33"} Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.606937 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.618074 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.632962 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.638306 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:23 crc 
kubenswrapper[4956]: I0930 05:29:23.638335 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.638346 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.638359 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.638371 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:23Z","lastTransitionTime":"2025-09-30T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.650750 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:21Z\\\",\\\"message\\\":\\\"ws:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 05:29:21.439153 6405 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-xlssx in node crc\\\\nI0930 05:29:21.439593 6405 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-xlssx after 0 failed attempt(s)\\\\nI0930 05:29:21.439606 6405 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xlssx\\\\nF0930 05:29:21.439070 6405 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curren\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.661180 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.671514 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.682565 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.693374 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.704691 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.718427 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.729627 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.740817 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.740853 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.740873 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.740891 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.740903 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:23Z","lastTransitionTime":"2025-09-30T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.741470 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.752012 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.765884 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea8
7c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.779724 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.795294 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:23Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.843001 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.843032 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.843041 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.843055 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.843065 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:23Z","lastTransitionTime":"2025-09-30T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.945405 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.945438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.945446 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.945459 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:23 crc kubenswrapper[4956]: I0930 05:29:23.945467 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:23Z","lastTransitionTime":"2025-09-30T05:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.048658 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.048712 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.048730 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.048753 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.048769 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:24Z","lastTransitionTime":"2025-09-30T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.151731 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.151776 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.151788 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.151808 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.151821 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:24Z","lastTransitionTime":"2025-09-30T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.171815 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ctwgh"] Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.172753 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:24 crc kubenswrapper[4956]: E0930 05:29:24.172849 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.198574 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.218485 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.237058 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.254613 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.254672 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.254693 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.254720 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.254740 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:24Z","lastTransitionTime":"2025-09-30T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.262664 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.280955 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.287455 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbkvt\" (UniqueName: \"kubernetes.io/projected/184140db-c30d-4f88-89ff-b7aa2dcca3d1-kube-api-access-sbkvt\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.287544 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.298911 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.327670 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:21Z\\\",\\\"message\\\":\\\"ws:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 05:29:21.439153 6405 ovn.go:134] Ensuring zone local for Pod 
openshift-image-registry/node-ca-xlssx in node crc\\\\nI0930 05:29:21.439593 6405 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-xlssx after 0 failed attempt(s)\\\\nI0930 05:29:21.439606 6405 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xlssx\\\\nF0930 05:29:21.439070 6405 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curren\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.340498 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.340498 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:24 crc kubenswrapper[4956]: E0930 05:29:24.340721 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.340754 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:24 crc kubenswrapper[4956]: E0930 05:29:24.340879 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:24 crc kubenswrapper[4956]: E0930 05:29:24.341007 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.345502 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.358349 4956 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.358384 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.358392 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.358407 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.358416 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:24Z","lastTransitionTime":"2025-09-30T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.360650 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.378526 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.388577 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:24 crc kubenswrapper[4956]: 
I0930 05:29:24.388660 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbkvt\" (UniqueName: \"kubernetes.io/projected/184140db-c30d-4f88-89ff-b7aa2dcca3d1-kube-api-access-sbkvt\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:24 crc kubenswrapper[4956]: E0930 05:29:24.388725 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:29:24 crc kubenswrapper[4956]: E0930 05:29:24.388811 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs podName:184140db-c30d-4f88-89ff-b7aa2dcca3d1 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:24.888786593 +0000 UTC m=+35.215907228 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs") pod "network-metrics-daemon-ctwgh" (UID: "184140db-c30d-4f88-89ff-b7aa2dcca3d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.392771 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.404358 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.415165 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbkvt\" (UniqueName: \"kubernetes.io/projected/184140db-c30d-4f88-89ff-b7aa2dcca3d1-kube-api-access-sbkvt\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.418843 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.432366 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.442397 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.452603 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.460521 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.460569 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.460584 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.460605 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.460619 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:24Z","lastTransitionTime":"2025-09-30T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.462407 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:24Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:24 crc 
kubenswrapper[4956]: I0930 05:29:24.562670 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.562731 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.562749 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.562775 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.562792 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:24Z","lastTransitionTime":"2025-09-30T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.665022 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.665056 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.665067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.665084 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.665095 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:24Z","lastTransitionTime":"2025-09-30T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.767067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.767131 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.767146 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.767165 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.767179 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:24Z","lastTransitionTime":"2025-09-30T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.869223 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.869260 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.869273 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.869290 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.869313 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:24Z","lastTransitionTime":"2025-09-30T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.893663 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:24 crc kubenswrapper[4956]: E0930 05:29:24.893778 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:29:24 crc kubenswrapper[4956]: E0930 05:29:24.893821 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs podName:184140db-c30d-4f88-89ff-b7aa2dcca3d1 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:25.893808472 +0000 UTC m=+36.220928997 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs") pod "network-metrics-daemon-ctwgh" (UID: "184140db-c30d-4f88-89ff-b7aa2dcca3d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.971764 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.972129 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.972144 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.972161 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:24 crc kubenswrapper[4956]: I0930 05:29:24.972171 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:24Z","lastTransitionTime":"2025-09-30T05:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.075196 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.075239 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.075254 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.075274 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.075288 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:25Z","lastTransitionTime":"2025-09-30T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.178061 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.178144 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.178161 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.178185 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.178202 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:25Z","lastTransitionTime":"2025-09-30T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.282997 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.283058 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.283072 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.283100 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.283127 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:25Z","lastTransitionTime":"2025-09-30T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.297713 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.297819 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.297848 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.297871 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.297894 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298004 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:29:41.29794686 +0000 UTC m=+51.625067425 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298025 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298067 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298137 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:41.298107835 +0000 UTC m=+51.625228360 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298163 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298208 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298217 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298260 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298286 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298316 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:41.298292171 +0000 UTC m=+51.625412726 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298052 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298378 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:41.298350523 +0000 UTC m=+51.625471088 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.298420 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-30 05:29:41.298399975 +0000 UTC m=+51.625520790 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.340440 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh"
Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.340665 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.387667 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.387715 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.387727 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.387745 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.387757 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:25Z","lastTransitionTime":"2025-09-30T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.490161 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.490195 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.490203 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.490218 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.490227 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:25Z","lastTransitionTime":"2025-09-30T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.592808 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.592858 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.592872 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.592895 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.592911 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:25Z","lastTransitionTime":"2025-09-30T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.695266 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.695306 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.695320 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.695336 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.695347 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:25Z","lastTransitionTime":"2025-09-30T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.797708 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.797742 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.797752 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.797765 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.797774 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:25Z","lastTransitionTime":"2025-09-30T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.900360 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.900394 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.900404 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.900418 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.900427 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:25Z","lastTransitionTime":"2025-09-30T05:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:29:25 crc kubenswrapper[4956]: I0930 05:29:25.903744 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh"
Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.903833 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 30 05:29:25 crc kubenswrapper[4956]: E0930 05:29:25.903873 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs podName:184140db-c30d-4f88-89ff-b7aa2dcca3d1 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:27.903861721 +0000 UTC m=+38.230982246 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs") pod "network-metrics-daemon-ctwgh" (UID: "184140db-c30d-4f88-89ff-b7aa2dcca3d1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.002089 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.002144 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.002156 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.002172 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.002184 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.104347 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.104713 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.104904 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.105088 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.105282 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.117180 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.117460 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.117641 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.117831 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.118007 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 05:29:26 crc kubenswrapper[4956]: E0930 05:29:26.138408 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:26Z is after 2025-08-24T17:21:41Z"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.142004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.142036 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.142058 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.142074 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.142087 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 05:29:26 crc kubenswrapper[4956]: E0930 05:29:26.154474 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:26Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.158503 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.158658 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.158744 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.158873 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.158999 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:26 crc kubenswrapper[4956]: E0930 05:29:26.174930 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:26Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.178186 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.178224 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.178233 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.178247 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.178256 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:26 crc kubenswrapper[4956]: E0930 05:29:26.190822 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:26Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.195157 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.195212 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.195230 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.195279 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.195296 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:26 crc kubenswrapper[4956]: E0930 05:29:26.209604 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:26Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:26 crc kubenswrapper[4956]: E0930 05:29:26.209752 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.210891 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.210915 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.210925 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.210942 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.210955 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.313237 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.313279 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.313290 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.313303 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.313316 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.340585 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.340582 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:26 crc kubenswrapper[4956]: E0930 05:29:26.340710 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.340777 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:26 crc kubenswrapper[4956]: E0930 05:29:26.340882 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:26 crc kubenswrapper[4956]: E0930 05:29:26.340943 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.415479 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.415517 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.415529 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.415547 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.415556 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.517684 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.517748 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.517762 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.517779 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.517790 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.620098 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.620152 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.620165 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.620180 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.620189 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.722286 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.722328 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.722338 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.722355 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.722365 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.825557 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.825778 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.825848 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.825914 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.825979 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.928250 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.928477 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.928562 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.928632 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:26 crc kubenswrapper[4956]: I0930 05:29:26.928689 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:26Z","lastTransitionTime":"2025-09-30T05:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.030901 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.030958 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.030969 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.030984 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.030994 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:27Z","lastTransitionTime":"2025-09-30T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.133854 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.134224 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.134412 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.134660 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.134917 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:27Z","lastTransitionTime":"2025-09-30T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.236783 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.237085 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.237312 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.237378 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.237397 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:27Z","lastTransitionTime":"2025-09-30T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.339896 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.340141 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.340196 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.340212 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:27 crc kubenswrapper[4956]: E0930 05:29:27.340155 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.340238 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.340257 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:27Z","lastTransitionTime":"2025-09-30T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.442968 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.443026 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.443044 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.443067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.443087 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:27Z","lastTransitionTime":"2025-09-30T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.545216 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.545265 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.545276 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.545293 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.545303 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:27Z","lastTransitionTime":"2025-09-30T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.647939 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.647979 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.647988 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.648002 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.648014 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:27Z","lastTransitionTime":"2025-09-30T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.750561 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.750862 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.750989 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.751147 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.751268 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:27Z","lastTransitionTime":"2025-09-30T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.853739 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.853794 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.853834 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.853858 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.853874 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:27Z","lastTransitionTime":"2025-09-30T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.928097 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:27 crc kubenswrapper[4956]: E0930 05:29:27.928464 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:29:27 crc kubenswrapper[4956]: E0930 05:29:27.928600 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs podName:184140db-c30d-4f88-89ff-b7aa2dcca3d1 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:31.928570224 +0000 UTC m=+42.255690789 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs") pod "network-metrics-daemon-ctwgh" (UID: "184140db-c30d-4f88-89ff-b7aa2dcca3d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.956819 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.956860 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.956874 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.956889 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:27 crc kubenswrapper[4956]: I0930 05:29:27.956902 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:27Z","lastTransitionTime":"2025-09-30T05:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.059945 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.060004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.060025 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.060050 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.060071 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:28Z","lastTransitionTime":"2025-09-30T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.163338 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.163803 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.164186 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.164530 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.165296 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:28Z","lastTransitionTime":"2025-09-30T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.268488 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.268519 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.268527 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.268541 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.268549 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:28Z","lastTransitionTime":"2025-09-30T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.341105 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:28 crc kubenswrapper[4956]: E0930 05:29:28.341290 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.341654 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.341705 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:28 crc kubenswrapper[4956]: E0930 05:29:28.342010 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:28 crc kubenswrapper[4956]: E0930 05:29:28.342223 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.371389 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.371451 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.371481 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.371511 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.371535 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:28Z","lastTransitionTime":"2025-09-30T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.474346 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.474376 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.474387 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.474401 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.474410 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:28Z","lastTransitionTime":"2025-09-30T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.576861 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.577299 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.577461 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.577683 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.577816 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:28Z","lastTransitionTime":"2025-09-30T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.680000 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.680044 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.680057 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.680076 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.680088 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:28Z","lastTransitionTime":"2025-09-30T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.782413 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.782445 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.782454 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.782467 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.782476 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:28Z","lastTransitionTime":"2025-09-30T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.884361 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.884417 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.884432 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.884456 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.884472 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:28Z","lastTransitionTime":"2025-09-30T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.986444 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.986482 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.986493 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.986531 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:28 crc kubenswrapper[4956]: I0930 05:29:28.986544 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:28Z","lastTransitionTime":"2025-09-30T05:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.092277 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.092495 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.092593 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.092688 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.092806 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:29Z","lastTransitionTime":"2025-09-30T05:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.195461 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.195500 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.195512 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.195531 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.195543 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:29Z","lastTransitionTime":"2025-09-30T05:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.298917 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.298985 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.299010 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.299039 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.299062 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:29Z","lastTransitionTime":"2025-09-30T05:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.340483 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:29 crc kubenswrapper[4956]: E0930 05:29:29.340637 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.380034 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.381832 4956 scope.go:117] "RemoveContainer" containerID="1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255" Sep 30 05:29:29 crc kubenswrapper[4956]: E0930 05:29:29.382326 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.402023 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.402068 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.402080 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.402402 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.402438 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:29Z","lastTransitionTime":"2025-09-30T05:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.505595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.505672 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.505698 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.505721 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.505737 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:29Z","lastTransitionTime":"2025-09-30T05:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.608269 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.608313 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.608325 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.608374 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.608391 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:29Z","lastTransitionTime":"2025-09-30T05:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.710403 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.710436 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.710445 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.710457 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.710465 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:29Z","lastTransitionTime":"2025-09-30T05:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.812733 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.813048 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.813180 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.813282 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.813393 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:29Z","lastTransitionTime":"2025-09-30T05:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.915721 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.915790 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.915809 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.915835 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:29 crc kubenswrapper[4956]: I0930 05:29:29.915854 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:29Z","lastTransitionTime":"2025-09-30T05:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.019198 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.019512 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.019641 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.019764 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.019863 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:30Z","lastTransitionTime":"2025-09-30T05:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.122156 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.122218 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.122234 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.122258 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.122275 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:30Z","lastTransitionTime":"2025-09-30T05:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.224692 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.224756 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.224779 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.224809 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.224827 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:30Z","lastTransitionTime":"2025-09-30T05:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.327625 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.327661 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.327671 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.327685 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.327695 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:30Z","lastTransitionTime":"2025-09-30T05:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.340384 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.340397 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:30 crc kubenswrapper[4956]: E0930 05:29:30.340506 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:30 crc kubenswrapper[4956]: E0930 05:29:30.340597 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.340715 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:30 crc kubenswrapper[4956]: E0930 05:29:30.340842 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.353664 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.369361 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-m
ultus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.391095 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:21Z\\\",\\\"message\\\":\\\"ws:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 05:29:21.439153 6405 ovn.go:134] Ensuring zone local for Pod 
openshift-image-registry/node-ca-xlssx in node crc\\\\nI0930 05:29:21.439593 6405 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-xlssx after 0 failed attempt(s)\\\\nI0930 05:29:21.439606 6405 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xlssx\\\\nF0930 05:29:21.439070 6405 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curren\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.406801 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.430577 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.430659 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.430684 4956 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.430717 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.430734 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:30Z","lastTransitionTime":"2025-09-30T05:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.436786 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c48
3eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.453939 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.469106 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.485293 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.502889 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.517249 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.530960 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.533597 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.534226 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.534401 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.534561 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.534748 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:30Z","lastTransitionTime":"2025-09-30T05:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.542893 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.555134 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.568075 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.582520 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.598287 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.616335 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.638060 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.638132 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.638149 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.638170 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.638185 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:30Z","lastTransitionTime":"2025-09-30T05:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.741417 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.741468 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.741482 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.741499 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.741511 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:30Z","lastTransitionTime":"2025-09-30T05:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.843636 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.843688 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.843699 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.843718 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.843730 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:30Z","lastTransitionTime":"2025-09-30T05:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.946781 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.947089 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.947492 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.947718 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:30 crc kubenswrapper[4956]: I0930 05:29:30.947969 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:30Z","lastTransitionTime":"2025-09-30T05:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.051509 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.051576 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.051598 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.051626 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.051648 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:31Z","lastTransitionTime":"2025-09-30T05:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.154595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.154708 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.154727 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.154751 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.154767 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:31Z","lastTransitionTime":"2025-09-30T05:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.256973 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.257017 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.257027 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.257039 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.257051 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:31Z","lastTransitionTime":"2025-09-30T05:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.340198 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:31 crc kubenswrapper[4956]: E0930 05:29:31.340322 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.359129 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.359164 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.359174 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.359194 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.359205 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:31Z","lastTransitionTime":"2025-09-30T05:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.461689 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.461751 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.461768 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.461796 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.461815 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:31Z","lastTransitionTime":"2025-09-30T05:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.564283 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.564343 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.564365 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.564397 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.564421 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:31Z","lastTransitionTime":"2025-09-30T05:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.667536 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.667600 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.667617 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.667641 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.667658 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:31Z","lastTransitionTime":"2025-09-30T05:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.770810 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.770879 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.770897 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.770924 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.770942 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:31Z","lastTransitionTime":"2025-09-30T05:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.873430 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.873485 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.873510 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.873539 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.873564 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:31Z","lastTransitionTime":"2025-09-30T05:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.970948 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:31 crc kubenswrapper[4956]: E0930 05:29:31.971168 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:29:31 crc kubenswrapper[4956]: E0930 05:29:31.971275 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs podName:184140db-c30d-4f88-89ff-b7aa2dcca3d1 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:39.971249134 +0000 UTC m=+50.298369749 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs") pod "network-metrics-daemon-ctwgh" (UID: "184140db-c30d-4f88-89ff-b7aa2dcca3d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.976783 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.976883 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.976929 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.976956 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:31 crc kubenswrapper[4956]: I0930 05:29:31.976973 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:31Z","lastTransitionTime":"2025-09-30T05:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.079218 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.079277 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.079293 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.079318 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.079338 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:32Z","lastTransitionTime":"2025-09-30T05:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.182264 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.182342 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.182354 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.182372 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.182385 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:32Z","lastTransitionTime":"2025-09-30T05:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.286188 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.286268 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.286286 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.286684 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.286742 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:32Z","lastTransitionTime":"2025-09-30T05:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.340987 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:32 crc kubenswrapper[4956]: E0930 05:29:32.341202 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.341343 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:32 crc kubenswrapper[4956]: E0930 05:29:32.341465 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.341751 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:32 crc kubenswrapper[4956]: E0930 05:29:32.341898 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.389091 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.389226 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.389256 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.389285 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.389305 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:32Z","lastTransitionTime":"2025-09-30T05:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.492233 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.492590 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.492736 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.492882 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.493020 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:32Z","lastTransitionTime":"2025-09-30T05:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.596521 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.596574 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.596592 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.596615 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.596634 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:32Z","lastTransitionTime":"2025-09-30T05:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.699384 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.699696 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.699757 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.699817 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.699873 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:32Z","lastTransitionTime":"2025-09-30T05:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.802107 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.802227 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.802249 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.802280 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.802298 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:32Z","lastTransitionTime":"2025-09-30T05:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.904451 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.904500 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.904510 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.904527 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:32 crc kubenswrapper[4956]: I0930 05:29:32.904535 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:32Z","lastTransitionTime":"2025-09-30T05:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.006549 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.006584 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.006595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.006614 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.006627 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:33Z","lastTransitionTime":"2025-09-30T05:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.109691 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.109742 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.109755 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.109776 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.109788 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:33Z","lastTransitionTime":"2025-09-30T05:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.213429 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.213501 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.213518 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.213541 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.213557 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:33Z","lastTransitionTime":"2025-09-30T05:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.316339 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.316411 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.316435 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.316466 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.316490 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:33Z","lastTransitionTime":"2025-09-30T05:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.340825 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:33 crc kubenswrapper[4956]: E0930 05:29:33.341024 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.418537 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.418582 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.418600 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.418624 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.418641 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:33Z","lastTransitionTime":"2025-09-30T05:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.521769 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.521820 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.521837 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.521859 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.521875 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:33Z","lastTransitionTime":"2025-09-30T05:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.624143 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.624187 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.624199 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.624218 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.624230 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:33Z","lastTransitionTime":"2025-09-30T05:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.726840 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.726922 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.726936 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.726951 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.726962 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:33Z","lastTransitionTime":"2025-09-30T05:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.829335 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.829394 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.829413 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.829446 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.829463 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:33Z","lastTransitionTime":"2025-09-30T05:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.932804 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.932842 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.932855 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.932872 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:33 crc kubenswrapper[4956]: I0930 05:29:33.932884 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:33Z","lastTransitionTime":"2025-09-30T05:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.036327 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.036407 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.036435 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.036464 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.036487 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:34Z","lastTransitionTime":"2025-09-30T05:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.139488 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.139551 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.139591 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.139626 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.139650 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:34Z","lastTransitionTime":"2025-09-30T05:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.242417 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.242467 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.242483 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.242505 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.242521 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:34Z","lastTransitionTime":"2025-09-30T05:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.340975 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.340999 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.341327 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:34 crc kubenswrapper[4956]: E0930 05:29:34.341232 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:34 crc kubenswrapper[4956]: E0930 05:29:34.341410 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:34 crc kubenswrapper[4956]: E0930 05:29:34.341510 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.345139 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.345203 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.345227 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.345256 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.345277 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:34Z","lastTransitionTime":"2025-09-30T05:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.447923 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.447986 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.448009 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.448037 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.448073 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:34Z","lastTransitionTime":"2025-09-30T05:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.549857 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.549891 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.549898 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.549911 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.549919 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:34Z","lastTransitionTime":"2025-09-30T05:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.652140 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.652189 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.652201 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.652220 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.652234 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:34Z","lastTransitionTime":"2025-09-30T05:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.753953 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.753986 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.753994 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.754007 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.754015 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:34Z","lastTransitionTime":"2025-09-30T05:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.856605 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.856657 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.856671 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.856696 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.856712 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:34Z","lastTransitionTime":"2025-09-30T05:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.959565 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.959605 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.959613 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.959627 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:34 crc kubenswrapper[4956]: I0930 05:29:34.959637 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:34Z","lastTransitionTime":"2025-09-30T05:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.062081 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.062199 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.062219 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.062245 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.062261 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:35Z","lastTransitionTime":"2025-09-30T05:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.165353 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.165418 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.165435 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.165461 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.165499 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:35Z","lastTransitionTime":"2025-09-30T05:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.268545 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.268602 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.268619 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.268642 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.268662 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:35Z","lastTransitionTime":"2025-09-30T05:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.340699 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:35 crc kubenswrapper[4956]: E0930 05:29:35.340891 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.371482 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.371533 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.371550 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.371572 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.371591 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:35Z","lastTransitionTime":"2025-09-30T05:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.474801 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.474847 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.474863 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.474884 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.474901 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:35Z","lastTransitionTime":"2025-09-30T05:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.577875 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.577961 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.577986 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.578062 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.578080 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:35Z","lastTransitionTime":"2025-09-30T05:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.682318 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.682434 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.682458 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.682483 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.682502 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:35Z","lastTransitionTime":"2025-09-30T05:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.786314 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.786368 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.786377 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.786393 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.786402 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:35Z","lastTransitionTime":"2025-09-30T05:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.889145 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.889180 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.889196 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.889212 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.889222 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:35Z","lastTransitionTime":"2025-09-30T05:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.992860 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.992901 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.992913 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.992928 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:35 crc kubenswrapper[4956]: I0930 05:29:35.992939 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:35Z","lastTransitionTime":"2025-09-30T05:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.095375 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.095443 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.095462 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.095485 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.095502 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.199001 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.199029 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.199037 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.199049 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.199059 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.302021 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.302074 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.302092 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.302146 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.302165 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.341399 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.341418 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.341528 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:36 crc kubenswrapper[4956]: E0930 05:29:36.341669 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:36 crc kubenswrapper[4956]: E0930 05:29:36.341799 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:36 crc kubenswrapper[4956]: E0930 05:29:36.341895 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.405346 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.405394 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.405408 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.405427 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.405440 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.421534 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.421585 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.421602 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.421627 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.421644 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: E0930 05:29:36.440674 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:36Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.445559 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.445618 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.445643 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.445671 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.445693 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: E0930 05:29:36.465305 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:36Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.469874 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.469926 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.469948 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.469978 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.470001 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: E0930 05:29:36.490422 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:36Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.495362 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.495390 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.495398 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.495413 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.495422 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: E0930 05:29:36.513964 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:36Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.523259 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.523312 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.523369 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.523393 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.523409 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: E0930 05:29:36.546281 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:36Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:36 crc kubenswrapper[4956]: E0930 05:29:36.546531 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.549052 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.549107 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.549161 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.549191 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.549212 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.651063 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.651097 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.651105 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.651136 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.651145 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.753326 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.753366 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.753383 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.753430 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.753443 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.856023 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.856086 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.856111 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.856185 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.856210 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.959456 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.959611 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.959631 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.959656 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:36 crc kubenswrapper[4956]: I0930 05:29:36.959673 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:36Z","lastTransitionTime":"2025-09-30T05:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.062934 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.063041 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.063066 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.063196 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.063239 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:37Z","lastTransitionTime":"2025-09-30T05:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.166319 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.166392 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.166415 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.166443 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.166464 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:37Z","lastTransitionTime":"2025-09-30T05:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.269560 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.269633 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.269656 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.269686 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.269711 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:37Z","lastTransitionTime":"2025-09-30T05:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.340690 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:37 crc kubenswrapper[4956]: E0930 05:29:37.340947 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.373176 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.373231 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.373249 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.373277 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.373297 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:37Z","lastTransitionTime":"2025-09-30T05:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.476039 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.476106 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.476163 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.476192 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.476211 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:37Z","lastTransitionTime":"2025-09-30T05:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.578856 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.578934 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.578957 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.578986 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.579008 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:37Z","lastTransitionTime":"2025-09-30T05:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.682275 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.682375 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.682401 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.682460 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.682483 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:37Z","lastTransitionTime":"2025-09-30T05:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.785997 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.786089 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.786157 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.786201 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.786224 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:37Z","lastTransitionTime":"2025-09-30T05:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.889416 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.889493 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.889561 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.889593 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.889618 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:37Z","lastTransitionTime":"2025-09-30T05:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.993031 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.993097 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.993143 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.993175 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:37 crc kubenswrapper[4956]: I0930 05:29:37.993197 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:37Z","lastTransitionTime":"2025-09-30T05:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.095936 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.095987 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.096003 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.096025 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.096044 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:38Z","lastTransitionTime":"2025-09-30T05:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.198777 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.198871 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.198904 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.198936 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.198959 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:38Z","lastTransitionTime":"2025-09-30T05:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.302564 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.302622 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.302639 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.302664 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.302682 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:38Z","lastTransitionTime":"2025-09-30T05:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.340889 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.341015 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.341256 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:38 crc kubenswrapper[4956]: E0930 05:29:38.341247 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:38 crc kubenswrapper[4956]: E0930 05:29:38.341375 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:38 crc kubenswrapper[4956]: E0930 05:29:38.341540 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.405976 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.406014 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.406022 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.406034 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.406043 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:38Z","lastTransitionTime":"2025-09-30T05:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.508351 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.508447 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.508476 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.508508 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.508530 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:38Z","lastTransitionTime":"2025-09-30T05:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.615635 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.615715 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.615737 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.615764 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.615783 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:38Z","lastTransitionTime":"2025-09-30T05:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.719096 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.719206 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.719230 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.719261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.719282 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:38Z","lastTransitionTime":"2025-09-30T05:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.822484 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.822551 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.822578 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.822608 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.822730 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:38Z","lastTransitionTime":"2025-09-30T05:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.925821 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.925870 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.925883 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.925901 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:38 crc kubenswrapper[4956]: I0930 05:29:38.925914 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:38Z","lastTransitionTime":"2025-09-30T05:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.028153 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.028202 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.028213 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.028233 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.028245 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:39Z","lastTransitionTime":"2025-09-30T05:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.130368 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.130432 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.130456 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.130486 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.130510 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:39Z","lastTransitionTime":"2025-09-30T05:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.233255 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.233302 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.233317 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.233336 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.233348 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:39Z","lastTransitionTime":"2025-09-30T05:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.336202 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.336267 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.336291 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.336320 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.336338 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:39Z","lastTransitionTime":"2025-09-30T05:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.340542 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:39 crc kubenswrapper[4956]: E0930 05:29:39.340709 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.439050 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.439148 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.439167 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.439189 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.439222 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:39Z","lastTransitionTime":"2025-09-30T05:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.541071 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.541164 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.541185 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.541218 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.541241 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:39Z","lastTransitionTime":"2025-09-30T05:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.643485 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.643546 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.643564 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.643588 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.643604 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:39Z","lastTransitionTime":"2025-09-30T05:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.746384 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.746446 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.746464 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.746488 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.746506 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:39Z","lastTransitionTime":"2025-09-30T05:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.849233 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.849289 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.849307 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.849332 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.849349 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:39Z","lastTransitionTime":"2025-09-30T05:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.952766 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.952829 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.952845 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.952868 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:39 crc kubenswrapper[4956]: I0930 05:29:39.952885 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:39Z","lastTransitionTime":"2025-09-30T05:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.055505 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.055552 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.055568 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.055591 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.055607 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:40Z","lastTransitionTime":"2025-09-30T05:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.057556 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:40 crc kubenswrapper[4956]: E0930 05:29:40.057735 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:29:40 crc kubenswrapper[4956]: E0930 05:29:40.057847 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs podName:184140db-c30d-4f88-89ff-b7aa2dcca3d1 nodeName:}" failed. No retries permitted until 2025-09-30 05:29:56.057819344 +0000 UTC m=+66.384939899 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs") pod "network-metrics-daemon-ctwgh" (UID: "184140db-c30d-4f88-89ff-b7aa2dcca3d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.159492 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.159560 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.159583 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.159610 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.159631 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:40Z","lastTransitionTime":"2025-09-30T05:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.261721 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.261780 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.261798 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.261824 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.261841 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:40Z","lastTransitionTime":"2025-09-30T05:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.340850 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.340910 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:40 crc kubenswrapper[4956]: E0930 05:29:40.341096 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.341144 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:40 crc kubenswrapper[4956]: E0930 05:29:40.341265 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:40 crc kubenswrapper[4956]: E0930 05:29:40.341430 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.359391 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.363527 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.363581 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.363599 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.363622 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.363643 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:40Z","lastTransitionTime":"2025-09-30T05:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.377081 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z 
is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.395658 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:21Z\\\",\\\"message\\\":\\\"ws:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 05:29:21.439153 6405 ovn.go:134] Ensuring zone local for Pod 
openshift-image-registry/node-ca-xlssx in node crc\\\\nI0930 05:29:21.439593 6405 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-xlssx after 0 failed attempt(s)\\\\nI0930 05:29:21.439606 6405 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xlssx\\\\nF0930 05:29:21.439070 6405 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curren\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.409926 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.442272 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.460708 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.465067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.465167 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.465211 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 
05:29:40.465233 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.465247 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:40Z","lastTransitionTime":"2025-09-30T05:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.479705 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.493408 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.509741 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.529683 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.544241 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.558780 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.567998 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.568036 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.568052 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.568071 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.568085 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:40Z","lastTransitionTime":"2025-09-30T05:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.574235 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc 
kubenswrapper[4956]: I0930 05:29:40.591749 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.608005 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.622681 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.637740 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.671727 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.671933 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.672027 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.672172 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.672303 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:40Z","lastTransitionTime":"2025-09-30T05:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.774889 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.774925 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.774936 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.774952 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.774964 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:40Z","lastTransitionTime":"2025-09-30T05:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.876660 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.876715 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.876731 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.876755 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.876772 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:40Z","lastTransitionTime":"2025-09-30T05:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.980238 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.980294 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.980311 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.980335 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:40 crc kubenswrapper[4956]: I0930 05:29:40.980354 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:40Z","lastTransitionTime":"2025-09-30T05:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.082989 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.083174 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.083197 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.083221 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.083238 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:41Z","lastTransitionTime":"2025-09-30T05:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.185937 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.186330 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.186348 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.186370 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.186388 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:41Z","lastTransitionTime":"2025-09-30T05:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.289257 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.289289 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.289300 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.289314 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.289325 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:41Z","lastTransitionTime":"2025-09-30T05:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.340958 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.341105 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.371619 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:30:13.371584865 +0000 UTC m=+83.698705430 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.371460 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.371885 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.372196 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 
05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.372228 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.372289 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.372462 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 05:30:13.372442323 +0000 UTC m=+83.699562878 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.372871 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.372959 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.373021 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.373084 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.373216 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:30:13.373192409 +0000 UTC m=+83.700313034 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.373220 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.373246 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.373295 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.373313 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.373319 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:30:13.373297423 +0000 UTC m=+83.700417958 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:29:41 crc kubenswrapper[4956]: E0930 05:29:41.373364 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 05:30:13.373347724 +0000 UTC m=+83.700468289 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.391435 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.391531 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.391551 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.391575 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.391592 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:41Z","lastTransitionTime":"2025-09-30T05:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.495825 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.495881 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.495893 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.495911 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.495923 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:41Z","lastTransitionTime":"2025-09-30T05:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.599443 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.599495 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.599507 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.599525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.599537 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:41Z","lastTransitionTime":"2025-09-30T05:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.702660 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.702714 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.702727 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.702745 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.702759 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:41Z","lastTransitionTime":"2025-09-30T05:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.805852 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.805903 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.805914 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.805934 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.805947 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:41Z","lastTransitionTime":"2025-09-30T05:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.909022 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.909086 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.909104 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.909169 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.909187 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:41Z","lastTransitionTime":"2025-09-30T05:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.912387 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.926273 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.935981 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:41Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.953659 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:41Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:41 crc kubenswrapper[4956]: I0930 05:29:41.976975 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:41Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.012514 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.012576 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.012595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.012620 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.012638 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:42Z","lastTransitionTime":"2025-09-30T05:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.014391 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.029817 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.050002 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.079911 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:21Z\\\",\\\"message\\\":\\\"ws:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 05:29:21.439153 6405 ovn.go:134] Ensuring zone local for Pod 
openshift-image-registry/node-ca-xlssx in node crc\\\\nI0930 05:29:21.439593 6405 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-xlssx after 0 failed attempt(s)\\\\nI0930 05:29:21.439606 6405 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xlssx\\\\nF0930 05:29:21.439070 6405 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curren\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.097346 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.115619 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.115677 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.115696 4956 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.115720 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.115740 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:42Z","lastTransitionTime":"2025-09-30T05:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.116880 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.134854 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.149991 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.168302 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.187697 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.205388 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.218211 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.218270 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.218287 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.218312 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.218330 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:42Z","lastTransitionTime":"2025-09-30T05:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.219343 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.229159 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.241590 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:42Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:42 crc 
kubenswrapper[4956]: I0930 05:29:42.321024 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.321287 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.321371 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.321453 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.321535 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:42Z","lastTransitionTime":"2025-09-30T05:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.341384 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:42 crc kubenswrapper[4956]: E0930 05:29:42.341550 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.341760 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.341821 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:42 crc kubenswrapper[4956]: E0930 05:29:42.341999 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:42 crc kubenswrapper[4956]: E0930 05:29:42.342157 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.424801 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.425025 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.425140 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.425214 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.425309 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:42Z","lastTransitionTime":"2025-09-30T05:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.529030 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.529295 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.529361 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.529422 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.529476 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:42Z","lastTransitionTime":"2025-09-30T05:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.633102 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.633189 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.633213 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.633243 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.633265 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:42Z","lastTransitionTime":"2025-09-30T05:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.736341 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.736411 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.736438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.736470 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.736487 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:42Z","lastTransitionTime":"2025-09-30T05:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.838818 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.839136 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.839281 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.839441 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.839563 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:42Z","lastTransitionTime":"2025-09-30T05:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.942305 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.942345 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.942354 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.942369 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:42 crc kubenswrapper[4956]: I0930 05:29:42.942379 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:42Z","lastTransitionTime":"2025-09-30T05:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.045193 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.045238 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.045249 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.045267 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.045278 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:43Z","lastTransitionTime":"2025-09-30T05:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.147803 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.148294 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.148396 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.148482 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.148567 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:43Z","lastTransitionTime":"2025-09-30T05:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.251031 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.251094 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.251149 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.251179 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.251202 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:43Z","lastTransitionTime":"2025-09-30T05:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.340232 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:43 crc kubenswrapper[4956]: E0930 05:29:43.340449 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.354318 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.354387 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.354407 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.354431 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.354448 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:43Z","lastTransitionTime":"2025-09-30T05:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.456349 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.456434 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.456464 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.456496 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.456515 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:43Z","lastTransitionTime":"2025-09-30T05:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.558260 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.558308 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.558321 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.558338 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.558349 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:43Z","lastTransitionTime":"2025-09-30T05:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.660078 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.660142 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.660156 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.660174 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.660186 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:43Z","lastTransitionTime":"2025-09-30T05:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.762911 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.763245 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.763423 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.763559 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.763693 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:43Z","lastTransitionTime":"2025-09-30T05:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.866045 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.866486 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.866709 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.866907 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.867143 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:43Z","lastTransitionTime":"2025-09-30T05:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.970496 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.970568 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.970581 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.970607 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:43 crc kubenswrapper[4956]: I0930 05:29:43.970619 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:43Z","lastTransitionTime":"2025-09-30T05:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.072540 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.072596 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.072604 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.072620 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.072629 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:44Z","lastTransitionTime":"2025-09-30T05:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.175520 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.175569 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.175585 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.175606 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.175620 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:44Z","lastTransitionTime":"2025-09-30T05:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.278361 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.278449 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.278459 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.278474 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.278484 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:44Z","lastTransitionTime":"2025-09-30T05:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.340648 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.340788 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.340900 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:44 crc kubenswrapper[4956]: E0930 05:29:44.340891 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:44 crc kubenswrapper[4956]: E0930 05:29:44.341357 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:44 crc kubenswrapper[4956]: E0930 05:29:44.341548 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.341649 4956 scope.go:117] "RemoveContainer" containerID="1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.380869 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.381067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.381245 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.381373 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.381477 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:44Z","lastTransitionTime":"2025-09-30T05:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.484600 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.484903 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.484921 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.484944 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.484962 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:44Z","lastTransitionTime":"2025-09-30T05:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.588206 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.588238 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.588246 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.588260 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.588271 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:44Z","lastTransitionTime":"2025-09-30T05:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.653358 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/1.log" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.656607 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerStarted","Data":"e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837"} Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.657032 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.677240 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.691902 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.691948 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.691959 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 
05:29:44.691976 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.691987 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:44Z","lastTransitionTime":"2025-09-30T05:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.699547 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.722534 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.741825 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc 
kubenswrapper[4956]: I0930 05:29:44.763548 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.783736 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.794339 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.794376 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.794386 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.794402 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.794413 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:44Z","lastTransitionTime":"2025-09-30T05:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.803598 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.817223 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.832964 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.848515 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\"
,\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.862602 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eecd9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.876036 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.887701 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.896544 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.896584 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.896597 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.896617 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.896630 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:44Z","lastTransitionTime":"2025-09-30T05:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.909425 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.918884 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.932976 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.959371 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:21Z\\\",\\\"message\\\":\\\"ws:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 05:29:21.439153 6405 ovn.go:134] Ensuring zone local for Pod 
openshift-image-registry/node-ca-xlssx in node crc\\\\nI0930 05:29:21.439593 6405 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-xlssx after 0 failed attempt(s)\\\\nI0930 05:29:21.439606 6405 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xlssx\\\\nF0930 05:29:21.439070 6405 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
curren\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.971056 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:44Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.998852 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.998888 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.998899 4956 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.998917 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:44 crc kubenswrapper[4956]: I0930 05:29:44.998928 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:44Z","lastTransitionTime":"2025-09-30T05:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.101321 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.101373 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.101386 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.101405 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.101417 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:45Z","lastTransitionTime":"2025-09-30T05:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.204081 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.204132 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.204143 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.204157 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.204168 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:45Z","lastTransitionTime":"2025-09-30T05:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.306761 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.306820 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.306836 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.306861 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.306877 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:45Z","lastTransitionTime":"2025-09-30T05:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.340259 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:45 crc kubenswrapper[4956]: E0930 05:29:45.340397 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.409246 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.409294 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.409306 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.409326 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.409338 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:45Z","lastTransitionTime":"2025-09-30T05:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.512595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.512636 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.512650 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.512668 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.512682 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:45Z","lastTransitionTime":"2025-09-30T05:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.614547 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.614599 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.614616 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.614640 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.614657 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:45Z","lastTransitionTime":"2025-09-30T05:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.662272 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/2.log" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.663139 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/1.log" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.666410 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" containerID="e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837" exitCode=1 Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.666458 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837"} Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.666500 4956 scope.go:117] "RemoveContainer" containerID="1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.667395 4956 scope.go:117] "RemoveContainer" containerID="e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837" Sep 30 05:29:45 crc kubenswrapper[4956]: E0930 05:29:45.667639 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.687988 4956 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2
e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671
ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.700325 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.716993 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.717073 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.717102 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.717163 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.717185 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:45Z","lastTransitionTime":"2025-09-30T05:29:45Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.721081 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\
\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.742019 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ab89196a720e4a20f84fb94128df8e25a9f9bda5baba957380d4efdb4ce5255\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:21Z\\\",\\\"message\\\":\\\"ws:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 05:29:21.439153 6405 ovn.go:134] Ensuring zone local for Pod 
openshift-image-registry/node-ca-xlssx in node crc\\\\nI0930 05:29:21.439593 6405 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-xlssx after 0 failed attempt(s)\\\\nI0930 05:29:21.439606 6405 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-xlssx\\\\nF0930 05:29:21.439070 6405 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: curren\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:45Z\\\",\\\"message\\\":\\\"etes/ovnkube-node-j8sw2 in node crc\\\\nI0930 05:29:45.262582 6695 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-j8sw2 after 0 failed attempt(s)\\\\nI0930 05:29:45.262586 6695 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-j8sw2\\\\nI0930 05:29:45.262594 6695 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0930 05:29:45.262597 6695 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z]\\\\nI0930 05:29:45.262605 6695 obj_retry.go:303] R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\
\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc 
kubenswrapper[4956]: I0930 05:29:45.754514 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.770477 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.787649 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.802130 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.817054 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.820052 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.820088 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.820100 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.820135 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.820147 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:45Z","lastTransitionTime":"2025-09-30T05:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.834952 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.850188 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.862535 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.874535 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.886443 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc 
kubenswrapper[4956]: I0930 05:29:45.905996 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b35
3d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.921622 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eecd9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.922381 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.922410 4956 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.922433 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.922446 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.922455 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:45Z","lastTransitionTime":"2025-09-30T05:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.936008 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:45 crc kubenswrapper[4956]: I0930 05:29:45.984423 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.024964 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.025018 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.025033 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.025054 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.025071 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.128011 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.128092 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.128107 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.128147 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.128159 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.232500 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.232556 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.232574 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.232598 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.232616 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.335735 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.335800 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.335824 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.335861 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.335883 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.340275 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.340364 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:46 crc kubenswrapper[4956]: E0930 05:29:46.340407 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:46 crc kubenswrapper[4956]: E0930 05:29:46.340579 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.340649 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:46 crc kubenswrapper[4956]: E0930 05:29:46.340894 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.442370 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.442438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.442457 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.442482 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.442500 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.545427 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.545500 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.545518 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.545542 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.545560 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.648436 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.648502 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.648518 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.648561 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.648578 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.672443 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/2.log" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.677868 4956 scope.go:117] "RemoveContainer" containerID="e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837" Sep 30 05:29:46 crc kubenswrapper[4956]: E0930 05:29:46.678154 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.678851 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.678891 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.678900 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.678914 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.678926 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: E0930 05:29:46.698693 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.701283 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.705472 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.705545 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.705569 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.705606 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.705629 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.727157 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: E0930 05:29:46.733571 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.739697 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.739763 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.739789 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.739818 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.739840 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.747945 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: E0930 05:29:46.760032 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.764656 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.765312 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.765372 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.765391 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.765414 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.765433 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.780234 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc 
kubenswrapper[4956]: E0930 05:29:46.785303 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.790528 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.790577 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.790596 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.790619 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.790636 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.799948 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcdda
d1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: E0930 05:29:46.811212 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: E0930 05:29:46.811678 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.813870 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.813939 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.813952 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.813993 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.814006 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.818561 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.832296 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.844370 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.859881 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.872563 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eecd9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.888921 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.910109 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.920253 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.920305 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.920317 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.920332 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.920663 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:46Z","lastTransitionTime":"2025-09-30T05:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.927628 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.953226 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.970203 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:46 crc kubenswrapper[4956]: I0930 05:29:46.984541 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:46Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.005193 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:45Z\\\",\\\"message\\\":\\\"etes/ovnkube-node-j8sw2 in node crc\\\\nI0930 05:29:45.262582 6695 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-j8sw2 after 0 failed attempt(s)\\\\nI0930 05:29:45.262586 6695 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-j8sw2\\\\nI0930 
05:29:45.262594 6695 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0930 05:29:45.262597 6695 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z]\\\\nI0930 05:29:45.262605 6695 obj_retry.go:303] R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:47Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.022397 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.022493 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.022636 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.022699 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.022754 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:47Z","lastTransitionTime":"2025-09-30T05:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.127019 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.127278 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.127364 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.127468 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.127546 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:47Z","lastTransitionTime":"2025-09-30T05:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.230695 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.230930 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.231012 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.231094 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.231210 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:47Z","lastTransitionTime":"2025-09-30T05:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.334657 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.334706 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.334723 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.334749 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.334767 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:47Z","lastTransitionTime":"2025-09-30T05:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.340490 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:47 crc kubenswrapper[4956]: E0930 05:29:47.340765 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.437977 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.438070 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.438091 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.438134 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.438152 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:47Z","lastTransitionTime":"2025-09-30T05:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.540502 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.540802 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.540935 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.541024 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.541096 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:47Z","lastTransitionTime":"2025-09-30T05:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.643646 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.643685 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.643694 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.643707 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.643719 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:47Z","lastTransitionTime":"2025-09-30T05:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.745940 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.746235 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.746325 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.746411 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.746484 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:47Z","lastTransitionTime":"2025-09-30T05:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.848261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.848621 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.848772 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.848896 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.849022 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:47Z","lastTransitionTime":"2025-09-30T05:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.951943 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.951996 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.952016 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.952039 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:47 crc kubenswrapper[4956]: I0930 05:29:47.952058 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:47Z","lastTransitionTime":"2025-09-30T05:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.054313 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.054726 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.054909 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.055089 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.055297 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:48Z","lastTransitionTime":"2025-09-30T05:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.158916 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.159359 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.159529 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.159702 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.159840 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:48Z","lastTransitionTime":"2025-09-30T05:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.262776 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.263199 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.263306 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.263409 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.263506 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:48Z","lastTransitionTime":"2025-09-30T05:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.340087 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.340088 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.340213 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:48 crc kubenswrapper[4956]: E0930 05:29:48.340349 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:48 crc kubenswrapper[4956]: E0930 05:29:48.340420 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:48 crc kubenswrapper[4956]: E0930 05:29:48.340522 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.365508 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.365561 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.365572 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.365594 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.365606 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:48Z","lastTransitionTime":"2025-09-30T05:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.468748 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.468824 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.468847 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.468874 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.468895 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:48Z","lastTransitionTime":"2025-09-30T05:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.571623 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.571691 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.571708 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.571731 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.571749 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:48Z","lastTransitionTime":"2025-09-30T05:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.674118 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.674181 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.674194 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.674212 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.674221 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:48Z","lastTransitionTime":"2025-09-30T05:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.776507 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.776546 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.776554 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.776571 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.776580 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:48Z","lastTransitionTime":"2025-09-30T05:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.879651 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.879708 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.879724 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.879747 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.879762 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:48Z","lastTransitionTime":"2025-09-30T05:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.982363 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.982466 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.982500 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.982534 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:48 crc kubenswrapper[4956]: I0930 05:29:48.982556 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:48Z","lastTransitionTime":"2025-09-30T05:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.085564 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.085633 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.085650 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.085673 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.085691 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:49Z","lastTransitionTime":"2025-09-30T05:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.189085 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.189199 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.189222 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.189249 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.189270 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:49Z","lastTransitionTime":"2025-09-30T05:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.292765 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.292818 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.292836 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.292860 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.292877 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:49Z","lastTransitionTime":"2025-09-30T05:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.340314 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:49 crc kubenswrapper[4956]: E0930 05:29:49.340530 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.394718 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.394762 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.394774 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.394790 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.394802 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:49Z","lastTransitionTime":"2025-09-30T05:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.497410 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.497462 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.497477 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.497498 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.497515 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:49Z","lastTransitionTime":"2025-09-30T05:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.600525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.600574 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.600589 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.600610 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.600625 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:49Z","lastTransitionTime":"2025-09-30T05:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.703926 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.703995 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.704020 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.704052 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.704075 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:49Z","lastTransitionTime":"2025-09-30T05:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.806961 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.807057 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.807076 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.807098 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.807145 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:49Z","lastTransitionTime":"2025-09-30T05:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.909562 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.909603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.909612 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.909628 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:49 crc kubenswrapper[4956]: I0930 05:29:49.909639 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:49Z","lastTransitionTime":"2025-09-30T05:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.012579 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.012837 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.012929 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.013015 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.013079 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:50Z","lastTransitionTime":"2025-09-30T05:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.115745 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.115779 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.115788 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.115800 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.115808 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:50Z","lastTransitionTime":"2025-09-30T05:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.218048 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.218375 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.218522 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.218665 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.218765 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:50Z","lastTransitionTime":"2025-09-30T05:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.321424 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.321495 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.321520 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.321548 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.321570 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:50Z","lastTransitionTime":"2025-09-30T05:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.339977 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.339998 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:50 crc kubenswrapper[4956]: E0930 05:29:50.340219 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.340270 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:50 crc kubenswrapper[4956]: E0930 05:29:50.340350 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:50 crc kubenswrapper[4956]: E0930 05:29:50.340439 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.357290 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.379535 4956 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3
ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mount
Path\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.391810 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.405761 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.425051 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:45Z\\\",\\\"message\\\":\\\"etes/ovnkube-node-j8sw2 in node crc\\\\nI0930 05:29:45.262582 6695 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-j8sw2 after 0 failed attempt(s)\\\\nI0930 05:29:45.262586 6695 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-j8sw2\\\\nI0930 
05:29:45.262594 6695 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0930 05:29:45.262597 6695 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z]\\\\nI0930 05:29:45.262605 6695 obj_retry.go:303] R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.426282 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.426494 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.426651 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.426795 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.427083 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:50Z","lastTransitionTime":"2025-09-30T05:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.442692 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.457468 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.470829 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.482771 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.495771 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc 
kubenswrapper[4956]: I0930 05:29:50.509137 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.528498 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.530860 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.530881 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.530889 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.530904 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.530913 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:50Z","lastTransitionTime":"2025-09-30T05:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.542683 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.555407 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.571114 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.587705 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eecd9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.605203 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.626398 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:50Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.633253 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.633293 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.633305 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.633321 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.633332 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:50Z","lastTransitionTime":"2025-09-30T05:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.735971 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.736023 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.736036 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.736056 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.736068 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:50Z","lastTransitionTime":"2025-09-30T05:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.838212 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.838262 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.838277 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.838300 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.838316 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:50Z","lastTransitionTime":"2025-09-30T05:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.941022 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.941067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.941079 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.941096 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:50 crc kubenswrapper[4956]: I0930 05:29:50.941108 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:50Z","lastTransitionTime":"2025-09-30T05:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.043471 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.043524 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.043536 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.043549 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.043559 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:51Z","lastTransitionTime":"2025-09-30T05:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.145867 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.145907 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.145916 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.145929 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.145940 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:51Z","lastTransitionTime":"2025-09-30T05:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.248712 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.248754 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.248764 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.248780 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.248789 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:51Z","lastTransitionTime":"2025-09-30T05:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.340162 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:51 crc kubenswrapper[4956]: E0930 05:29:51.340308 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.351536 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.351619 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.351646 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.351678 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.351702 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:51Z","lastTransitionTime":"2025-09-30T05:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.455214 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.455280 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.455298 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.455325 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.455346 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:51Z","lastTransitionTime":"2025-09-30T05:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.558079 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.558108 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.558146 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.558165 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.558180 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:51Z","lastTransitionTime":"2025-09-30T05:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.660901 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.660928 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.660936 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.660948 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.660958 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:51Z","lastTransitionTime":"2025-09-30T05:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.764544 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.764585 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.764595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.764609 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.764619 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:51Z","lastTransitionTime":"2025-09-30T05:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.867459 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.867514 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.867580 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.867603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.867615 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:51Z","lastTransitionTime":"2025-09-30T05:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.969721 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.969803 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.969824 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.969849 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:51 crc kubenswrapper[4956]: I0930 05:29:51.969867 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:51Z","lastTransitionTime":"2025-09-30T05:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.073193 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.073285 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.073303 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.073326 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.073342 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:52Z","lastTransitionTime":"2025-09-30T05:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.175709 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.175761 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.175772 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.175787 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.175798 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:52Z","lastTransitionTime":"2025-09-30T05:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.278494 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.278563 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.278582 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.278605 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.278622 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:52Z","lastTransitionTime":"2025-09-30T05:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.340241 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.340276 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.340247 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:52 crc kubenswrapper[4956]: E0930 05:29:52.340417 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:52 crc kubenswrapper[4956]: E0930 05:29:52.340487 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:52 crc kubenswrapper[4956]: E0930 05:29:52.340586 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.380690 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.380740 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.380757 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.380779 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.380799 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:52Z","lastTransitionTime":"2025-09-30T05:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.483275 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.483315 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.483325 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.483340 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.483350 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:52Z","lastTransitionTime":"2025-09-30T05:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.585881 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.585923 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.585931 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.585947 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.585955 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:52Z","lastTransitionTime":"2025-09-30T05:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.688891 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.688966 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.688985 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.689012 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.689030 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:52Z","lastTransitionTime":"2025-09-30T05:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.792009 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.792135 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.792152 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.792222 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.792238 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:52Z","lastTransitionTime":"2025-09-30T05:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.894674 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.894720 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.894730 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.894747 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.894759 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:52Z","lastTransitionTime":"2025-09-30T05:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.998109 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.998210 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.998230 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.998263 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:52 crc kubenswrapper[4956]: I0930 05:29:52.998287 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:52Z","lastTransitionTime":"2025-09-30T05:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.100907 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.100978 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.100999 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.101025 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.101042 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:53Z","lastTransitionTime":"2025-09-30T05:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.205170 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.205237 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.205255 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.205278 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.205295 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:53Z","lastTransitionTime":"2025-09-30T05:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.306908 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.306963 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.306976 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.306994 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.307008 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:53Z","lastTransitionTime":"2025-09-30T05:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.344486 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:53 crc kubenswrapper[4956]: E0930 05:29:53.344708 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.410179 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.410240 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.410256 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.410281 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.410329 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:53Z","lastTransitionTime":"2025-09-30T05:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.513618 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.513745 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.513768 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.513797 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.513820 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:53Z","lastTransitionTime":"2025-09-30T05:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.616194 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.616240 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.616252 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.616270 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.616284 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:53Z","lastTransitionTime":"2025-09-30T05:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.718715 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.718766 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.718779 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.718832 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.718846 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:53Z","lastTransitionTime":"2025-09-30T05:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.821755 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.821817 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.821834 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.821856 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.821890 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:53Z","lastTransitionTime":"2025-09-30T05:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.925499 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.925593 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.925617 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.925652 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:53 crc kubenswrapper[4956]: I0930 05:29:53.925675 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:53Z","lastTransitionTime":"2025-09-30T05:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.028153 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.028196 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.028206 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.028219 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.028229 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:54Z","lastTransitionTime":"2025-09-30T05:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.130105 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.130147 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.130156 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.130169 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.130178 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:54Z","lastTransitionTime":"2025-09-30T05:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.232983 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.233030 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.233041 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.233057 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.233078 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:54Z","lastTransitionTime":"2025-09-30T05:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.335862 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.335903 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.335915 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.335932 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.335943 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:54Z","lastTransitionTime":"2025-09-30T05:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.340399 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.340446 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:54 crc kubenswrapper[4956]: E0930 05:29:54.340489 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.340399 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:54 crc kubenswrapper[4956]: E0930 05:29:54.340579 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:54 crc kubenswrapper[4956]: E0930 05:29:54.340719 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.437892 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.437940 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.437953 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.437972 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.437998 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:54Z","lastTransitionTime":"2025-09-30T05:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.540975 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.541006 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.541015 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.541027 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.541036 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:54Z","lastTransitionTime":"2025-09-30T05:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.642552 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.642598 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.642609 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.642628 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.642639 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:54Z","lastTransitionTime":"2025-09-30T05:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.744726 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.744788 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.744811 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.744836 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.744853 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:54Z","lastTransitionTime":"2025-09-30T05:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.846959 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.846993 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.847003 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.847018 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.847028 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:54Z","lastTransitionTime":"2025-09-30T05:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.948801 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.948837 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.948848 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.948861 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:54 crc kubenswrapper[4956]: I0930 05:29:54.948871 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:54Z","lastTransitionTime":"2025-09-30T05:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.051174 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.051230 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.051240 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.051253 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.051263 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:55Z","lastTransitionTime":"2025-09-30T05:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.153285 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.153328 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.153337 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.153357 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.153367 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:55Z","lastTransitionTime":"2025-09-30T05:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.255958 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.255992 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.256004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.256019 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.256031 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:55Z","lastTransitionTime":"2025-09-30T05:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.340832 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:55 crc kubenswrapper[4956]: E0930 05:29:55.341266 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.358647 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.358689 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.358699 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.358717 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.358730 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:55Z","lastTransitionTime":"2025-09-30T05:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.461045 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.461080 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.461091 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.461106 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.461116 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:55Z","lastTransitionTime":"2025-09-30T05:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.563207 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.563244 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.563254 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.563268 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.563279 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:55Z","lastTransitionTime":"2025-09-30T05:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.665829 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.665869 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.665878 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.665891 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.665899 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:55Z","lastTransitionTime":"2025-09-30T05:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.772630 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.772673 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.772686 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.772702 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.772715 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:55Z","lastTransitionTime":"2025-09-30T05:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.874849 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.874875 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.874882 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.874894 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.874902 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:55Z","lastTransitionTime":"2025-09-30T05:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.979156 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.979192 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.979200 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.979216 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:55 crc kubenswrapper[4956]: I0930 05:29:55.979226 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:55Z","lastTransitionTime":"2025-09-30T05:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.081936 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.081982 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.081995 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.082008 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.082019 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:56Z","lastTransitionTime":"2025-09-30T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.131741 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:56 crc kubenswrapper[4956]: E0930 05:29:56.131877 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:29:56 crc kubenswrapper[4956]: E0930 05:29:56.131936 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs podName:184140db-c30d-4f88-89ff-b7aa2dcca3d1 nodeName:}" failed. No retries permitted until 2025-09-30 05:30:28.131918465 +0000 UTC m=+98.459038990 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs") pod "network-metrics-daemon-ctwgh" (UID: "184140db-c30d-4f88-89ff-b7aa2dcca3d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.184581 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.184629 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.184641 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.184659 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.184671 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:56Z","lastTransitionTime":"2025-09-30T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.287494 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.287550 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.287559 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.287573 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.287582 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:56Z","lastTransitionTime":"2025-09-30T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.340689 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.340759 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.340696 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:56 crc kubenswrapper[4956]: E0930 05:29:56.340813 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:56 crc kubenswrapper[4956]: E0930 05:29:56.340878 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:56 crc kubenswrapper[4956]: E0930 05:29:56.341030 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.390951 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.390997 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.391014 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.391035 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.391052 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:56Z","lastTransitionTime":"2025-09-30T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.493382 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.493421 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.493432 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.493445 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.493455 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:56Z","lastTransitionTime":"2025-09-30T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.595722 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.595755 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.595763 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.595776 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.595785 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:56Z","lastTransitionTime":"2025-09-30T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.698058 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.698106 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.698138 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.698151 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.698160 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:56Z","lastTransitionTime":"2025-09-30T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.800735 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.800795 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.800812 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.800834 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.800853 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:56Z","lastTransitionTime":"2025-09-30T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.904040 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.904080 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.904088 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.904100 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:56 crc kubenswrapper[4956]: I0930 05:29:56.904112 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:56Z","lastTransitionTime":"2025-09-30T05:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.006968 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.007011 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.007022 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.007042 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.007055 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.109685 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.109715 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.109724 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.109738 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.109747 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.187820 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.187862 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.187875 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.187893 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.187905 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: E0930 05:29:57.206799 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:57Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.210553 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.210576 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.210583 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.210595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.210605 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: E0930 05:29:57.226830 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:57Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.231868 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.231886 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.231894 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.231906 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.231913 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: E0930 05:29:57.249422 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:57Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.253688 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.253748 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.253774 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.253799 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.253816 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: E0930 05:29:57.270882 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:57Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.274011 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.274041 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.274051 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.274066 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.274076 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: E0930 05:29:57.289386 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:57Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:57 crc kubenswrapper[4956]: E0930 05:29:57.289923 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.292523 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.292568 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.292586 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.292609 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.292625 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.340269 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:57 crc kubenswrapper[4956]: E0930 05:29:57.340441 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.394953 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.394994 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.395002 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.395014 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.395026 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.497465 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.497504 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.497512 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.497528 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.497537 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.599149 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.599203 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.599247 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.599262 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.599272 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.702402 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.702444 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.702456 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.702476 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.702488 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.804695 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.804722 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.804730 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.804743 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.804751 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.907369 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.907405 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.907414 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.907431 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:57 crc kubenswrapper[4956]: I0930 05:29:57.907441 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:57Z","lastTransitionTime":"2025-09-30T05:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.009848 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.009894 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.009905 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.009920 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.009934 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:58Z","lastTransitionTime":"2025-09-30T05:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.112352 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.112384 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.112392 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.112405 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.112418 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:58Z","lastTransitionTime":"2025-09-30T05:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.214571 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.214611 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.214624 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.214639 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.214651 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:58Z","lastTransitionTime":"2025-09-30T05:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.316823 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.316893 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.316905 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.316922 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.316936 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:58Z","lastTransitionTime":"2025-09-30T05:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.340194 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.340288 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.340480 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:29:58 crc kubenswrapper[4956]: E0930 05:29:58.340565 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:29:58 crc kubenswrapper[4956]: E0930 05:29:58.340611 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:29:58 crc kubenswrapper[4956]: E0930 05:29:58.340679 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.340897 4956 scope.go:117] "RemoveContainer" containerID="e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837" Sep 30 05:29:58 crc kubenswrapper[4956]: E0930 05:29:58.341202 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.419832 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.419890 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.419900 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.419915 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.419924 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:58Z","lastTransitionTime":"2025-09-30T05:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.522753 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.522790 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.522801 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.522816 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.522826 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:58Z","lastTransitionTime":"2025-09-30T05:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.625403 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.625435 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.625444 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.625456 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.625465 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:58Z","lastTransitionTime":"2025-09-30T05:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.710478 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-frfx9_72ad9902-843c-4117-9ac1-c34d525c9d55/kube-multus/0.log" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.710521 4956 generic.go:334] "Generic (PLEG): container finished" podID="72ad9902-843c-4117-9ac1-c34d525c9d55" containerID="7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b" exitCode=1 Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.710547 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-frfx9" event={"ID":"72ad9902-843c-4117-9ac1-c34d525c9d55","Type":"ContainerDied","Data":"7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b"} Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.710872 4956 scope.go:117] "RemoveContainer" containerID="7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.725190 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.728863 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.728899 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.728915 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.728937 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.728953 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:58Z","lastTransitionTime":"2025-09-30T05:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.742182 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc 
kubenswrapper[4956]: I0930 05:29:58.761391 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.775825 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.791322 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.802084 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.816960 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.845668 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.845703 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.845712 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.845729 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.845739 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:58Z","lastTransitionTime":"2025-09-30T05:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.853487 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eec
d9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.875218 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.896433 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.907875 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.948394 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.948423 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.948432 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.948445 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.948454 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:58Z","lastTransitionTime":"2025-09-30T05:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.952862 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.962849 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.974935 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"2025-09-30T05:29:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73\\\\n2025-09-30T05:29:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73 to /host/opt/cni/bin/\\\\n2025-09-30T05:29:12Z [verbose] multus-daemon started\\\\n2025-09-30T05:29:12Z [verbose] Readiness Indicator file check\\\\n2025-09-30T05:29:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:58 crc kubenswrapper[4956]: I0930 05:29:58.999470 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:45Z\\\",\\\"message\\\":\\\"etes/ovnkube-node-j8sw2 in node crc\\\\nI0930 05:29:45.262582 6695 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-j8sw2 after 0 failed attempt(s)\\\\nI0930 05:29:45.262586 6695 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-j8sw2\\\\nI0930 
05:29:45.262594 6695 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0930 05:29:45.262597 6695 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z]\\\\nI0930 05:29:45.262605 6695 obj_retry.go:303] R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:58Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.016426 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.027312 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.040960 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.050381 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.050419 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.050431 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:59 crc 
kubenswrapper[4956]: I0930 05:29:59.050445 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.050455 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:59Z","lastTransitionTime":"2025-09-30T05:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.152469 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.152505 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.152514 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.152528 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.152536 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:59Z","lastTransitionTime":"2025-09-30T05:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.254621 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.254653 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.254663 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.254676 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.254685 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:59Z","lastTransitionTime":"2025-09-30T05:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.340757 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:29:59 crc kubenswrapper[4956]: E0930 05:29:59.340880 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.357445 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.357485 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.357494 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.357509 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.357519 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:59Z","lastTransitionTime":"2025-09-30T05:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.460034 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.460072 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.460096 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.460115 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.460139 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:59Z","lastTransitionTime":"2025-09-30T05:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.561631 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.561672 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.561683 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.561700 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.561712 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:59Z","lastTransitionTime":"2025-09-30T05:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.663605 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.663657 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.663668 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.663683 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.663691 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:59Z","lastTransitionTime":"2025-09-30T05:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.714817 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-frfx9_72ad9902-843c-4117-9ac1-c34d525c9d55/kube-multus/0.log" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.714861 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-frfx9" event={"ID":"72ad9902-843c-4117-9ac1-c34d525c9d55","Type":"ContainerStarted","Data":"13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119"} Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.734999 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8
b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea8
7c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.747931 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eecd9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.761639 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.765799 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.765832 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.765841 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:59 crc 
kubenswrapper[4956]: I0930 05:29:59.765874 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.765887 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:59Z","lastTransitionTime":"2025-09-30T05:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.781302 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99
030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.811439 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.834275 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.851564 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"2025-09-30T05:29:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73\\\\n2025-09-30T05:29:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73 to /host/opt/cni/bin/\\\\n2025-09-30T05:29:12Z [verbose] multus-daemon started\\\\n2025-09-30T05:29:12Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T05:29:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.868287 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.868321 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.868332 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.868345 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.868355 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:59Z","lastTransitionTime":"2025-09-30T05:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.880231 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:45Z\\\",\\\"message\\\":\\\"etes/ovnkube-node-j8sw2 in node crc\\\\nI0930 05:29:45.262582 6695 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-j8sw2 after 0 failed attempt(s)\\\\nI0930 05:29:45.262586 6695 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-j8sw2\\\\nI0930 
05:29:45.262594 6695 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0930 05:29:45.262597 6695 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z]\\\\nI0930 05:29:45.262605 6695 obj_retry.go:303] R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.894715 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.914447 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.931345 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.952347 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.968069 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.970393 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.970536 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.970547 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.970562 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.970571 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:29:59Z","lastTransitionTime":"2025-09-30T05:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.980365 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:29:59 crc kubenswrapper[4956]: I0930 05:29:59.992377 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:59Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.001850 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.010780 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.018517 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc 
kubenswrapper[4956]: I0930 05:30:00.073106 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.073170 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.073188 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.073207 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.073215 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:00Z","lastTransitionTime":"2025-09-30T05:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.175319 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.175353 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.175362 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.175376 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.175385 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:00Z","lastTransitionTime":"2025-09-30T05:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.277731 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.277775 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.277788 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.277804 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.277817 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:00Z","lastTransitionTime":"2025-09-30T05:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.340630 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:00 crc kubenswrapper[4956]: E0930 05:30:00.340755 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.340764 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.340800 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:00 crc kubenswrapper[4956]: E0930 05:30:00.340985 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:00 crc kubenswrapper[4956]: E0930 05:30:00.341065 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.360352 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.370315 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.380307 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.380338 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.380351 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.380366 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.380374 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:00Z","lastTransitionTime":"2025-09-30T05:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.387068 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"2025-09-30T05:29:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73\\\\n2025-09-30T05:29:12+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73 to /host/opt/cni/bin/\\\\n2025-09-30T05:29:12Z [verbose] multus-daemon started\\\\n2025-09-30T05:29:12Z [verbose] Readiness Indicator file check\\\\n2025-09-30T05:29:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.418045 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:45Z\\\",\\\"message\\\":\\\"etes/ovnkube-node-j8sw2 in node crc\\\\nI0930 05:29:45.262582 6695 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-j8sw2 after 0 failed attempt(s)\\\\nI0930 05:29:45.262586 6695 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-j8sw2\\\\nI0930 05:29:45.262594 6695 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0930 05:29:45.262597 6695 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z]\\\\nI0930 05:29:45.262605 6695 obj_retry.go:303] R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.429690 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.442462 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.453302 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.462398 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.473789 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc 
kubenswrapper[4956]: I0930 05:30:00.482512 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.482549 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.482560 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.482573 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.482583 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:00Z","lastTransitionTime":"2025-09-30T05:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.485011 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcdda
d1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.496385 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.506861 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.516526 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.526556 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.538427 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\"
,\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.549214 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eecd9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.559866 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.574720 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:00Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.584392 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.584428 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.584438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.584450 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.584460 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:00Z","lastTransitionTime":"2025-09-30T05:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.686317 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.686352 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.686361 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.686375 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.686383 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:00Z","lastTransitionTime":"2025-09-30T05:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.788991 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.789036 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.789051 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.789070 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.789082 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:00Z","lastTransitionTime":"2025-09-30T05:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.892149 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.892200 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.892210 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.892223 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.892233 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:00Z","lastTransitionTime":"2025-09-30T05:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.994342 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.994380 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.994389 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.994404 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:00 crc kubenswrapper[4956]: I0930 05:30:00.994413 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:00Z","lastTransitionTime":"2025-09-30T05:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.097234 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.097274 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.097283 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.097298 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.097307 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:01Z","lastTransitionTime":"2025-09-30T05:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.199294 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.199330 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.199339 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.199353 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.199365 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:01Z","lastTransitionTime":"2025-09-30T05:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.302259 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.302329 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.302344 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.302360 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.302373 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:01Z","lastTransitionTime":"2025-09-30T05:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.340604 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:01 crc kubenswrapper[4956]: E0930 05:30:01.340760 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.404647 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.404683 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.404694 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.404709 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.404719 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:01Z","lastTransitionTime":"2025-09-30T05:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.506995 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.507037 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.507047 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.507060 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.507069 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:01Z","lastTransitionTime":"2025-09-30T05:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.609130 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.609162 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.609170 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.609186 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.609194 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:01Z","lastTransitionTime":"2025-09-30T05:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.711492 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.711520 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.711528 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.711539 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.711548 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:01Z","lastTransitionTime":"2025-09-30T05:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.813678 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.813716 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.813725 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.813739 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.813747 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:01Z","lastTransitionTime":"2025-09-30T05:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.915234 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.915266 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.915274 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.915286 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:01 crc kubenswrapper[4956]: I0930 05:30:01.915294 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:01Z","lastTransitionTime":"2025-09-30T05:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.017609 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.017639 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.017647 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.017660 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.017668 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:02Z","lastTransitionTime":"2025-09-30T05:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.120247 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.120280 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.120288 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.120306 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.120315 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:02Z","lastTransitionTime":"2025-09-30T05:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.222105 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.222178 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.222190 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.222210 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.222221 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:02Z","lastTransitionTime":"2025-09-30T05:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.325217 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.325249 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.325259 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.325272 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.325280 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:02Z","lastTransitionTime":"2025-09-30T05:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.340481 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.340516 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:02 crc kubenswrapper[4956]: E0930 05:30:02.340596 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.340611 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:02 crc kubenswrapper[4956]: E0930 05:30:02.340754 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:02 crc kubenswrapper[4956]: E0930 05:30:02.340828 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.427439 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.427507 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.427525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.427550 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.427567 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:02Z","lastTransitionTime":"2025-09-30T05:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.529561 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.529617 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.529626 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.529640 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.529650 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:02Z","lastTransitionTime":"2025-09-30T05:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.632229 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.632266 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.632274 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.632288 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.632299 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:02Z","lastTransitionTime":"2025-09-30T05:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.734949 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.734981 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.734991 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.735005 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.735017 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:02Z","lastTransitionTime":"2025-09-30T05:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.837711 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.837747 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.837755 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.837773 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.837781 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:02Z","lastTransitionTime":"2025-09-30T05:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.939593 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.939646 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.939660 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.939691 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:02 crc kubenswrapper[4956]: I0930 05:30:02.939703 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:02Z","lastTransitionTime":"2025-09-30T05:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.042227 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.042352 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.042376 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.042402 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.042419 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:03Z","lastTransitionTime":"2025-09-30T05:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.145142 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.145228 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.145243 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.145261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.145375 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:03Z","lastTransitionTime":"2025-09-30T05:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.247527 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.247594 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.247610 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.247634 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.247646 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:03Z","lastTransitionTime":"2025-09-30T05:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.340762 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:03 crc kubenswrapper[4956]: E0930 05:30:03.340956 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.350945 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.351005 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.351027 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.351053 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.351074 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:03Z","lastTransitionTime":"2025-09-30T05:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.453252 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.453294 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.453303 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.453318 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.453331 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:03Z","lastTransitionTime":"2025-09-30T05:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.555978 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.556026 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.556042 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.556062 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.556077 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:03Z","lastTransitionTime":"2025-09-30T05:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.658573 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.658632 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.658648 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.658669 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.658684 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:03Z","lastTransitionTime":"2025-09-30T05:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.761816 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.761872 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.761890 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.761913 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.761929 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:03Z","lastTransitionTime":"2025-09-30T05:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.863803 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.863859 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.863876 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.863898 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.863935 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:03Z","lastTransitionTime":"2025-09-30T05:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.966460 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.966523 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.966560 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.966593 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:03 crc kubenswrapper[4956]: I0930 05:30:03.966618 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:03Z","lastTransitionTime":"2025-09-30T05:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.069987 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.070029 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.070038 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.070052 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.070062 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:04Z","lastTransitionTime":"2025-09-30T05:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.171908 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.171934 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.171941 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.171953 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.171962 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:04Z","lastTransitionTime":"2025-09-30T05:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.274743 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.274805 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.274825 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.274854 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.274877 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:04Z","lastTransitionTime":"2025-09-30T05:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.344435 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.344495 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:04 crc kubenswrapper[4956]: E0930 05:30:04.344604 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:04 crc kubenswrapper[4956]: E0930 05:30:04.344714 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.344760 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:04 crc kubenswrapper[4956]: E0930 05:30:04.344836 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.377417 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.377506 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.377523 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.377545 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.377562 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:04Z","lastTransitionTime":"2025-09-30T05:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.480290 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.480345 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.480361 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.480384 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.480403 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:04Z","lastTransitionTime":"2025-09-30T05:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.583460 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.583497 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.583504 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.583517 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.583526 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:04Z","lastTransitionTime":"2025-09-30T05:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.685451 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.685508 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.685525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.685548 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.685565 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:04Z","lastTransitionTime":"2025-09-30T05:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.788141 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.788168 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.788175 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.788187 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.788195 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:04Z","lastTransitionTime":"2025-09-30T05:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.891055 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.891106 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.891169 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.891194 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.891211 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:04Z","lastTransitionTime":"2025-09-30T05:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.994331 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.994382 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.994399 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.994421 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:04 crc kubenswrapper[4956]: I0930 05:30:04.994439 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:04Z","lastTransitionTime":"2025-09-30T05:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.097723 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.097777 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.097794 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.097817 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.097837 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:05Z","lastTransitionTime":"2025-09-30T05:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.200947 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.200990 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.201006 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.201027 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.201044 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:05Z","lastTransitionTime":"2025-09-30T05:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.303517 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.303596 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.303615 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.303647 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.303664 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:05Z","lastTransitionTime":"2025-09-30T05:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.340288 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:05 crc kubenswrapper[4956]: E0930 05:30:05.340470 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.406726 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.406790 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.406814 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.406842 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.406865 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:05Z","lastTransitionTime":"2025-09-30T05:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.509743 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.509801 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.509817 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.509852 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.509888 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:05Z","lastTransitionTime":"2025-09-30T05:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.613828 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.613903 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.613927 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.613957 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.613981 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:05Z","lastTransitionTime":"2025-09-30T05:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.717339 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.717393 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.717414 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.717446 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.717468 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:05Z","lastTransitionTime":"2025-09-30T05:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.821783 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.821849 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.821868 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.821893 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.821912 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:05Z","lastTransitionTime":"2025-09-30T05:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.925699 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.925784 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.925991 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.926030 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:05 crc kubenswrapper[4956]: I0930 05:30:05.926053 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:05Z","lastTransitionTime":"2025-09-30T05:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.029545 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.029620 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.029643 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.029672 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.029689 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:06Z","lastTransitionTime":"2025-09-30T05:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.133679 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.133758 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.133965 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.133991 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.134007 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:06Z","lastTransitionTime":"2025-09-30T05:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.236576 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.236625 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.236641 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.236665 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.236683 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:06Z","lastTransitionTime":"2025-09-30T05:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.339978 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.340043 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.340060 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.340085 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.340102 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:06Z","lastTransitionTime":"2025-09-30T05:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.346469 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.346559 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.346486 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:06 crc kubenswrapper[4956]: E0930 05:30:06.346651 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:06 crc kubenswrapper[4956]: E0930 05:30:06.346767 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:06 crc kubenswrapper[4956]: E0930 05:30:06.346876 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.447003 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.447099 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.447160 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.447221 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.447244 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:06Z","lastTransitionTime":"2025-09-30T05:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.550367 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.550436 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.550459 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.550488 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.550511 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:06Z","lastTransitionTime":"2025-09-30T05:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.653854 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.653938 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.653963 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.653996 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.654019 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:06Z","lastTransitionTime":"2025-09-30T05:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.757185 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.757649 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.757686 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.757716 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.757738 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:06Z","lastTransitionTime":"2025-09-30T05:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.860205 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.861101 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.861182 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.861208 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.861226 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:06Z","lastTransitionTime":"2025-09-30T05:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.963768 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.963838 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.963853 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.963870 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:06 crc kubenswrapper[4956]: I0930 05:30:06.963906 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:06Z","lastTransitionTime":"2025-09-30T05:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.067614 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.067660 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.067672 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.067690 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.067703 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.170580 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.170651 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.170672 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.170697 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.170714 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.274926 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.275004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.275028 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.275055 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.275076 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.340660 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:07 crc kubenswrapper[4956]: E0930 05:30:07.340898 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.377868 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.377922 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.377938 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.377963 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.377981 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.481195 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.481261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.481283 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.481313 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.481332 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.584508 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.584578 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.584603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.584632 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.584653 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.627671 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.627724 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.627743 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.627766 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.627785 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: E0930 05:30:07.645657 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:07Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.650308 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.650356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.650371 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.650391 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.650407 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: E0930 05:30:07.669445 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:07Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.673335 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.673385 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.673399 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.673417 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.673434 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.690224 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.690385 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.690501 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.690615 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.690727 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: E0930 05:30:07.708804 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:07Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.712454 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.712571 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.712594 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.712621 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.712642 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: E0930 05:30:07.732880 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:07Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:07 crc kubenswrapper[4956]: E0930 05:30:07.733625 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.737178 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.737769 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.737931 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.738169 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.738413 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.841938 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.842260 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.842369 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.842466 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.842584 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.946283 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.947240 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.947438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.947579 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:07 crc kubenswrapper[4956]: I0930 05:30:07.947690 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:07Z","lastTransitionTime":"2025-09-30T05:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.051654 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.052145 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.052346 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.052533 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.052699 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:08Z","lastTransitionTime":"2025-09-30T05:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.156908 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.156968 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.156985 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.157014 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.157031 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:08Z","lastTransitionTime":"2025-09-30T05:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.260844 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.261109 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.261264 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.261370 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.261466 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:08Z","lastTransitionTime":"2025-09-30T05:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.340437 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.340484 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.340529 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:08 crc kubenswrapper[4956]: E0930 05:30:08.340955 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:08 crc kubenswrapper[4956]: E0930 05:30:08.341359 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:08 crc kubenswrapper[4956]: E0930 05:30:08.341397 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.364746 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.364795 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.364811 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.364832 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.364850 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:08Z","lastTransitionTime":"2025-09-30T05:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.468544 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.468619 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.468639 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.468661 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.468682 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:08Z","lastTransitionTime":"2025-09-30T05:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.572274 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.572368 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.572385 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.572408 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.572424 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:08Z","lastTransitionTime":"2025-09-30T05:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.674676 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.674715 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.674725 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.674738 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.674747 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:08Z","lastTransitionTime":"2025-09-30T05:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.777542 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.777596 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.777614 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.777636 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.777652 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:08Z","lastTransitionTime":"2025-09-30T05:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.880136 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.880185 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.880200 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.880221 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.880238 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:08Z","lastTransitionTime":"2025-09-30T05:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.982420 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.982468 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.982483 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.982503 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:08 crc kubenswrapper[4956]: I0930 05:30:08.982517 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:08Z","lastTransitionTime":"2025-09-30T05:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.084385 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.084440 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.084459 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.084481 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.084496 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:09Z","lastTransitionTime":"2025-09-30T05:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.187351 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.187396 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.187405 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.187420 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.187429 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:09Z","lastTransitionTime":"2025-09-30T05:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.290296 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.290339 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.290349 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.290367 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.290378 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:09Z","lastTransitionTime":"2025-09-30T05:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.340825 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:09 crc kubenswrapper[4956]: E0930 05:30:09.340952 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.392966 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.393004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.393014 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.393030 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.393041 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:09Z","lastTransitionTime":"2025-09-30T05:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.495964 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.496004 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.496017 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.496035 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.496048 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:09Z","lastTransitionTime":"2025-09-30T05:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.599595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.599710 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.599732 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.599779 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.599805 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:09Z","lastTransitionTime":"2025-09-30T05:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.703591 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.703986 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.704187 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.704343 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.704594 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:09Z","lastTransitionTime":"2025-09-30T05:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.807523 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.808013 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.808050 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.808078 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.808095 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:09Z","lastTransitionTime":"2025-09-30T05:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.911102 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.911171 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.911183 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.911200 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:09 crc kubenswrapper[4956]: I0930 05:30:09.911212 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:09Z","lastTransitionTime":"2025-09-30T05:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.014208 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.015172 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.015373 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.015555 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.015695 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:10Z","lastTransitionTime":"2025-09-30T05:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.119780 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.119855 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.119874 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.119898 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.119915 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:10Z","lastTransitionTime":"2025-09-30T05:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.223300 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.223341 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.223350 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.223363 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.223379 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:10Z","lastTransitionTime":"2025-09-30T05:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.326540 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.326607 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.326633 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.326660 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.326682 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:10Z","lastTransitionTime":"2025-09-30T05:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.340527 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.340538 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.340750 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:10 crc kubenswrapper[4956]: E0930 05:30:10.340706 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:10 crc kubenswrapper[4956]: E0930 05:30:10.341542 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:10 crc kubenswrapper[4956]: E0930 05:30:10.341717 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.359487 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.372637 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eecd9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.384821 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.397580 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.417190 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7
f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.429240 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.429304 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.429323 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.429348 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.429367 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:10Z","lastTransitionTime":"2025-09-30T05:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.429598 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.445100 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"2025-09-30T05:29:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73\\\\n2025-09-30T05:29:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73 to /host/opt/cni/bin/\\\\n2025-09-30T05:29:12Z [verbose] multus-daemon started\\\\n2025-09-30T05:29:12Z [verbose] Readiness Indicator file check\\\\n2025-09-30T05:29:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.473543 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:45Z\\\",\\\"message\\\":\\\"etes/ovnkube-node-j8sw2 in node crc\\\\nI0930 05:29:45.262582 6695 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-j8sw2 after 0 failed attempt(s)\\\\nI0930 05:29:45.262586 6695 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-j8sw2\\\\nI0930 
05:29:45.262594 6695 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0930 05:29:45.262597 6695 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z]\\\\nI0930 05:29:45.262605 6695 obj_retry.go:303] R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.489683 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.508367 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.524883 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.532925 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.532970 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.532986 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.533035 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.533051 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:10Z","lastTransitionTime":"2025-09-30T05:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.537648 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e3
8bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.551240 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc 
kubenswrapper[4956]: I0930 05:30:10.564720 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.584974 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.599426 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.614467 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.628076 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:10Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.636174 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.636233 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.636254 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.636321 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.636347 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:10Z","lastTransitionTime":"2025-09-30T05:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.738975 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.739028 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.739049 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.739076 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.739094 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:10Z","lastTransitionTime":"2025-09-30T05:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.841952 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.842002 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.842014 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.842033 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.842045 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:10Z","lastTransitionTime":"2025-09-30T05:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.945210 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.945266 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.945284 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.945309 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:10 crc kubenswrapper[4956]: I0930 05:30:10.945326 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:10Z","lastTransitionTime":"2025-09-30T05:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.047281 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.047344 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.047362 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.047424 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.047444 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:11Z","lastTransitionTime":"2025-09-30T05:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.149495 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.149547 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.149566 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.149588 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.149604 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:11Z","lastTransitionTime":"2025-09-30T05:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.252764 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.252837 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.252855 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.252880 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.252899 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:11Z","lastTransitionTime":"2025-09-30T05:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.340763 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:11 crc kubenswrapper[4956]: E0930 05:30:11.341006 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.342266 4956 scope.go:117] "RemoveContainer" containerID="e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.354767 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.354881 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.354905 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.354933 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.354955 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:11Z","lastTransitionTime":"2025-09-30T05:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.458072 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.458165 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.458186 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.458210 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.458228 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:11Z","lastTransitionTime":"2025-09-30T05:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.561598 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.561745 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.561836 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.561925 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.562017 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:11Z","lastTransitionTime":"2025-09-30T05:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.664908 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.664964 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.664977 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.664997 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.665011 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:11Z","lastTransitionTime":"2025-09-30T05:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.755471 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/2.log" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.759606 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerStarted","Data":"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da"} Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.760286 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.768824 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.768870 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.768888 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.768909 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.768926 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:11Z","lastTransitionTime":"2025-09-30T05:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.778080 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eec
d9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.794816 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.814593 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.832923 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.854270 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.871245 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"2025-09-30T05:29:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73\\\\n2025-09-30T05:29:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73 to /host/opt/cni/bin/\\\\n2025-09-30T05:29:12Z [verbose] multus-daemon started\\\\n2025-09-30T05:29:12Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T05:29:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.871309 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.871343 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.871353 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.871370 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.871381 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:11Z","lastTransitionTime":"2025-09-30T05:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.887378 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:45Z\\\",\\\"message\\\":\\\"etes/ovnkube-node-j8sw2 in node crc\\\\nI0930 05:29:45.262582 6695 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-j8sw2 after 0 failed attempt(s)\\\\nI0930 05:29:45.262586 6695 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-j8sw2\\\\nI0930 
05:29:45.262594 6695 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0930 05:29:45.262597 6695 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z]\\\\nI0930 05:29:45.262605 6695 obj_retry.go:303] 
R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.897187 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.923293 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.934923 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.953719 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.966070 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.973466 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.973496 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.973507 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:11 crc 
kubenswrapper[4956]: I0930 05:30:11.973525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.973536 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:11Z","lastTransitionTime":"2025-09-30T05:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.979793 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:11 crc kubenswrapper[4956]: I0930 05:30:11.994229 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:11Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.007470 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.018379 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.030101 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc 
kubenswrapper[4956]: I0930 05:30:12.042038 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.076172 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.076212 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.076222 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.076235 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.076246 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:12Z","lastTransitionTime":"2025-09-30T05:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.178885 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.178924 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.178935 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.178952 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.178964 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:12Z","lastTransitionTime":"2025-09-30T05:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.281614 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.281652 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.281661 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.281674 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.281683 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:12Z","lastTransitionTime":"2025-09-30T05:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.341048 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.341086 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:12 crc kubenswrapper[4956]: E0930 05:30:12.341213 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:12 crc kubenswrapper[4956]: E0930 05:30:12.341320 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.341487 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:12 crc kubenswrapper[4956]: E0930 05:30:12.341566 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.384762 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.384819 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.384837 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.384860 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.384881 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:12Z","lastTransitionTime":"2025-09-30T05:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.487409 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.487459 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.487471 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.487486 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.487498 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:12Z","lastTransitionTime":"2025-09-30T05:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.590608 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.590671 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.590688 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.590713 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.590733 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:12Z","lastTransitionTime":"2025-09-30T05:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.692875 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.692911 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.692921 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.692933 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.692943 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:12Z","lastTransitionTime":"2025-09-30T05:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.765404 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/3.log" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.766066 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/2.log" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.768607 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" containerID="f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da" exitCode=1 Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.768679 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da"} Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.768753 4956 scope.go:117] "RemoveContainer" containerID="e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.770068 4956 scope.go:117] "RemoveContainer" containerID="f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da" Sep 30 05:30:12 crc kubenswrapper[4956]: E0930 05:30:12.770442 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.783078 4956 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.795977 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.797134 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.797176 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.797188 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.797205 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:12 crc 
kubenswrapper[4956]: I0930 05:30:12.797216 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:12Z","lastTransitionTime":"2025-09-30T05:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.811235 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc 
kubenswrapper[4956]: I0930 05:30:12.829824 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.849531 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.867735 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.886902 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.899558 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.899616 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.899632 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.899656 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.899674 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:12Z","lastTransitionTime":"2025-09-30T05:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.907862 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.921552 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eecd9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.941259 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.970080 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e974da7572c4575dd3c31259a2c39841c0e6658aeb32a2343c45dd329912d837\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:45Z\\\",\\\"message\\\":\\\"etes/ovnkube-node-j8sw2 in node crc\\\\nI0930 05:29:45.262582 6695 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-j8sw2 after 0 failed attempt(s)\\\\nI0930 05:29:45.262586 6695 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-j8sw2\\\\nI0930 05:29:45.262594 6695 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0930 05:29:45.262597 6695 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:29:45Z is after 2025-08-24T17:21:41Z]\\\\nI0930 05:29:45.262605 6695 obj_retry.go:303] R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:30:12Z\\\",\\\"message\\\":\\\", Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 05:30:12.300777 7053 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.109\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 05:30:12.300826 7053 services_controller.go:452] Built service openshift-kube-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0930 05:30:12.300762 7053 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 
0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f3
1a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:12 crc kubenswrapper[4956]: I0930 05:30:12.982256 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:12Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.003333 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.003419 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.003442 4956 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.003467 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.003485 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:13Z","lastTransitionTime":"2025-09-30T05:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.013369 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c48
3eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.029831 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.050088 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"2025-09-30T05:29:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73\\\\n2025-09-30T05:29:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73 to /host/opt/cni/bin/\\\\n2025-09-30T05:29:12Z [verbose] multus-daemon started\\\\n2025-09-30T05:29:12Z [verbose] Readiness Indicator file check\\\\n2025-09-30T05:29:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.066927 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.084873 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.101861 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.105761 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.105787 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.105796 4956 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.105809 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.105818 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:13Z","lastTransitionTime":"2025-09-30T05:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.208619 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.208665 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.208678 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.208696 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.208707 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:13Z","lastTransitionTime":"2025-09-30T05:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.311633 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.311677 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.311716 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.311738 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.311756 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:13Z","lastTransitionTime":"2025-09-30T05:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.340849 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.341299 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.356906 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.414702 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.414761 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.414779 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.414802 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.414819 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:13Z","lastTransitionTime":"2025-09-30T05:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.431784 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.431993 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.431956624 +0000 UTC m=+147.759077199 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.432053 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.432104 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.432205 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.432244 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.432291 4956 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.432320 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.432326 4956 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.432339 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.432352 4956 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.432369 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.432346108 +0000 UTC m=+147.759466673 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.432404 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.432385499 +0000 UTC m=+147.759506064 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.432431 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.4324169 +0000 UTC m=+147.759537465 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.432494 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.432540 4956 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.432569 4956 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:30:13 crc 
kubenswrapper[4956]: E0930 05:30:13.432679 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.432649028 +0000 UTC m=+147.759769613 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.518103 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.518199 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.518217 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.518242 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.518260 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:13Z","lastTransitionTime":"2025-09-30T05:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.622441 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.622484 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.622500 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.622521 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.622540 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:13Z","lastTransitionTime":"2025-09-30T05:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.725703 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.725777 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.725856 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.725911 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.725938 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:13Z","lastTransitionTime":"2025-09-30T05:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.775375 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/3.log" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.779018 4956 scope.go:117] "RemoveContainer" containerID="f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da" Sep 30 05:30:13 crc kubenswrapper[4956]: E0930 05:30:13.779155 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.796669 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\"
,\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.812282 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eecd9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.828957 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.828998 4956 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.829013 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.829032 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.829045 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:13Z","lastTransitionTime":"2025-09-30T05:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.828959 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.849666 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.873348 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7
f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.889171 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.909492 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"2025-09-30T05:29:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73\\\\n2025-09-30T05:29:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73 to /host/opt/cni/bin/\\\\n2025-09-30T05:29:12Z [verbose] multus-daemon started\\\\n2025-09-30T05:29:12Z [verbose] Readiness Indicator file check\\\\n2025-09-30T05:29:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.931655 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.931691 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.931705 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.931721 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.931734 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:13Z","lastTransitionTime":"2025-09-30T05:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.933725 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:30:12Z\\\",\\\"message\\\":\\\", Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 05:30:12.300777 7053 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.109\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 05:30:12.300826 7053 services_controller.go:452] Built service openshift-kube-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0930 05:30:12.300762 7053 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:30:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.947566 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.961319 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb0b6479-ba40-445f-a018-08a0689b9547\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0479f4bb7141ac2e5f5eda2994f1bca6f2b3ded35fce60581a8428858575bf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e464b77d04edaec7bd1158adcf6a0b18f0baa4417c81703570f156e7165877c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e464b77d04edaec7bd1158adcf6a0b18f0baa4417c81703570f156e7165877c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.978794 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:13 crc kubenswrapper[4956]: I0930 05:30:13.995296 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:13Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.009704 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.030078 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.034688 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.034725 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.034737 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.034752 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.034764 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:14Z","lastTransitionTime":"2025-09-30T05:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.045324 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.057975 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.070543 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.084140 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.098214 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:14Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:14 crc 
kubenswrapper[4956]: I0930 05:30:14.136581 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.136647 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.136672 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.136702 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.136729 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:14Z","lastTransitionTime":"2025-09-30T05:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.239143 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.239237 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.239251 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.239295 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.239312 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:14Z","lastTransitionTime":"2025-09-30T05:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.340775 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:14 crc kubenswrapper[4956]: E0930 05:30:14.341181 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.341258 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:14 crc kubenswrapper[4956]: E0930 05:30:14.341447 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.342487 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.342520 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.342533 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.342551 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.342564 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:14Z","lastTransitionTime":"2025-09-30T05:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.340765 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:14 crc kubenswrapper[4956]: E0930 05:30:14.343431 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.445401 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.445447 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.445460 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.445477 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.445489 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:14Z","lastTransitionTime":"2025-09-30T05:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.547756 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.547813 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.547830 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.547853 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.547870 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:14Z","lastTransitionTime":"2025-09-30T05:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.650603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.650662 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.650678 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.650702 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.650720 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:14Z","lastTransitionTime":"2025-09-30T05:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.753999 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.754072 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.754091 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.754145 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.754164 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:14Z","lastTransitionTime":"2025-09-30T05:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.857477 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.857515 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.857527 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.857545 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.857556 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:14Z","lastTransitionTime":"2025-09-30T05:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.960724 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.960858 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.960882 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.960908 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:14 crc kubenswrapper[4956]: I0930 05:30:14.960926 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:14Z","lastTransitionTime":"2025-09-30T05:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.063311 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.063359 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.063371 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.063389 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.063401 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:15Z","lastTransitionTime":"2025-09-30T05:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.167167 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.167213 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.167227 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.167249 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.167264 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:15Z","lastTransitionTime":"2025-09-30T05:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.269871 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.269943 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.269970 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.269998 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.270020 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:15Z","lastTransitionTime":"2025-09-30T05:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.340029 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:15 crc kubenswrapper[4956]: E0930 05:30:15.340622 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.372552 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.372613 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.372639 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.372669 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.372690 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:15Z","lastTransitionTime":"2025-09-30T05:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.474818 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.474853 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.474864 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.474877 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.474887 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:15Z","lastTransitionTime":"2025-09-30T05:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.577967 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.578026 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.578048 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.578076 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.578099 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:15Z","lastTransitionTime":"2025-09-30T05:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.681246 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.681306 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.681326 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.681351 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.681369 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:15Z","lastTransitionTime":"2025-09-30T05:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.784205 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.784278 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.784303 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.784334 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.784359 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:15Z","lastTransitionTime":"2025-09-30T05:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.886919 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.886979 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.886997 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.887021 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.887041 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:15Z","lastTransitionTime":"2025-09-30T05:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.990358 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.990408 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.990424 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.990452 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:15 crc kubenswrapper[4956]: I0930 05:30:15.990469 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:15Z","lastTransitionTime":"2025-09-30T05:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.094505 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.094564 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.094581 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.094609 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.094627 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:16Z","lastTransitionTime":"2025-09-30T05:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.197777 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.197873 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.197887 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.197906 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.197917 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:16Z","lastTransitionTime":"2025-09-30T05:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.301367 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.301444 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.301470 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.301504 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.301522 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:16Z","lastTransitionTime":"2025-09-30T05:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.340126 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.340148 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.340150 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:16 crc kubenswrapper[4956]: E0930 05:30:16.340239 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:16 crc kubenswrapper[4956]: E0930 05:30:16.340307 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:16 crc kubenswrapper[4956]: E0930 05:30:16.340406 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.404026 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.404068 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.404078 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.404095 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.404106 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:16Z","lastTransitionTime":"2025-09-30T05:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.506430 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.506464 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.506472 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.506487 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.506495 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:16Z","lastTransitionTime":"2025-09-30T05:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.609527 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.609575 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.609584 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.609601 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.609611 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:16Z","lastTransitionTime":"2025-09-30T05:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.712278 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.712312 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.712319 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.712331 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.712339 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:16Z","lastTransitionTime":"2025-09-30T05:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.814802 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.814839 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.814847 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.814858 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.814867 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:16Z","lastTransitionTime":"2025-09-30T05:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.917420 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.917452 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.917461 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.917477 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:16 crc kubenswrapper[4956]: I0930 05:30:16.917486 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:16Z","lastTransitionTime":"2025-09-30T05:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.020356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.020427 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.020447 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.020470 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.020488 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:17Z","lastTransitionTime":"2025-09-30T05:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.124656 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.124752 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.124768 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.124789 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.124811 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:17Z","lastTransitionTime":"2025-09-30T05:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.227666 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.227774 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.227792 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.228249 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.228310 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:17Z","lastTransitionTime":"2025-09-30T05:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.331709 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.331759 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.331771 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.331789 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.331801 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:17Z","lastTransitionTime":"2025-09-30T05:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.339902 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:17 crc kubenswrapper[4956]: E0930 05:30:17.340018 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.434928 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.434992 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.435012 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.435034 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.435052 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:17Z","lastTransitionTime":"2025-09-30T05:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.540031 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.540068 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.540077 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.540091 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.540101 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:17Z","lastTransitionTime":"2025-09-30T05:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.642960 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.643024 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.643044 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.643067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.643084 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:17Z","lastTransitionTime":"2025-09-30T05:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.746630 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.746692 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.746709 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.746735 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.746753 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:17Z","lastTransitionTime":"2025-09-30T05:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.848703 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.848752 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.848769 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.848792 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.848808 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:17Z","lastTransitionTime":"2025-09-30T05:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.952574 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.952628 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.952647 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.952670 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.952687 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:17Z","lastTransitionTime":"2025-09-30T05:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.993637 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.993676 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.993693 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.993713 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:30:17 crc kubenswrapper[4956]: I0930 05:30:17.993729 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:17Z","lastTransitionTime":"2025-09-30T05:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 05:30:18 crc kubenswrapper[4956]: E0930 05:30:18.014188 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:18Z is after 2025-08-24T17:21:41Z"
Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.020284 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.020373 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.020393 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.020419 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.020444 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 05:30:18 crc kubenswrapper[4956]: E0930 05:30:18.041445 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:18Z is after 2025-08-24T17:21:41Z"
Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.046769 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.046828 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.046845 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.046869 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.046888 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 05:30:18 crc kubenswrapper[4956]: E0930 05:30:18.068308 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.075300 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.075359 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.075379 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.075402 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.075419 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:18 crc kubenswrapper[4956]: E0930 05:30:18.095444 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.101028 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.101080 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.101097 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.101152 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.101174 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:18 crc kubenswrapper[4956]: E0930 05:30:18.122814 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:18Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:18 crc kubenswrapper[4956]: E0930 05:30:18.123163 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.125555 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.125610 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.125632 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.125665 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.125689 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.229106 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.229225 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.229248 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.229279 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.229301 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.332601 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.332686 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.332705 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.332729 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.332746 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.341390 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.341458 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.341486 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:18 crc kubenswrapper[4956]: E0930 05:30:18.341599 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:18 crc kubenswrapper[4956]: E0930 05:30:18.341733 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:18 crc kubenswrapper[4956]: E0930 05:30:18.341838 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.436020 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.436090 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.436156 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.436220 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.436244 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.539338 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.539384 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.539400 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.539420 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.539430 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.642060 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.642143 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.642165 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.642189 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.642207 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.745674 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.745747 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.745771 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.745802 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.745824 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.849070 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.849150 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.849168 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.849189 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.849206 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.951356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.951393 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.951460 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.951480 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:18 crc kubenswrapper[4956]: I0930 05:30:18.951490 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:18Z","lastTransitionTime":"2025-09-30T05:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.054177 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.054212 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.054220 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.054232 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.054240 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:19Z","lastTransitionTime":"2025-09-30T05:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.157539 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.157602 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.157619 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.157649 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.157672 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:19Z","lastTransitionTime":"2025-09-30T05:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.260337 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.260410 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.260430 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.260456 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.260474 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:19Z","lastTransitionTime":"2025-09-30T05:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.340919 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:19 crc kubenswrapper[4956]: E0930 05:30:19.341333 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.363264 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.363303 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.363311 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.363325 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.363335 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:19Z","lastTransitionTime":"2025-09-30T05:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.466376 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.466408 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.466416 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.466430 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.466440 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:19Z","lastTransitionTime":"2025-09-30T05:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.568553 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.568635 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.568652 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.568677 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.568696 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:19Z","lastTransitionTime":"2025-09-30T05:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.672098 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.672235 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.672265 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.672288 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.672305 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:19Z","lastTransitionTime":"2025-09-30T05:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.775463 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.775530 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.775552 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.775576 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.775591 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:19Z","lastTransitionTime":"2025-09-30T05:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.878555 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.878626 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.878649 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.878678 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.878698 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:19Z","lastTransitionTime":"2025-09-30T05:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.981975 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.982397 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.982553 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.982699 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:19 crc kubenswrapper[4956]: I0930 05:30:19.982826 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:19Z","lastTransitionTime":"2025-09-30T05:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.085880 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.085934 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.085952 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.085975 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.085994 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:20Z","lastTransitionTime":"2025-09-30T05:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.189318 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.189377 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.189393 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.189416 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.189433 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:20Z","lastTransitionTime":"2025-09-30T05:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.293389 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.293745 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.293764 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.293788 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.293806 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:20Z","lastTransitionTime":"2025-09-30T05:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.340943 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.341059 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:20 crc kubenswrapper[4956]: E0930 05:30:20.341192 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.341232 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:20 crc kubenswrapper[4956]: E0930 05:30:20.341406 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:20 crc kubenswrapper[4956]: E0930 05:30:20.341544 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.360466 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.384684 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.396487 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.396544 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.396567 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.396596 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.396618 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:20Z","lastTransitionTime":"2025-09-30T05:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.406552 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.424071 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eecd9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.443794 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"2025-09-30T05:29:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73\\\\n2025-09-30T05:29:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73 to /host/opt/cni/bin/\\\\n2025-09-30T05:29:12Z [verbose] multus-daemon started\\\\n2025-09-30T05:29:12Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T05:29:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.471845 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:30:12Z\\\",\\\"message\\\":\\\", Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 05:30:12.300777 7053 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.109\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 05:30:12.300826 7053 services_controller.go:452] Built service openshift-kube-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0930 05:30:12.300762 7053 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:30:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.487777 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.499687 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.499732 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.499742 4956 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.499757 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.499767 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:20Z","lastTransitionTime":"2025-09-30T05:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.519679 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c48
3eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.530472 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.543850 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.559846 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.570868 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb0b6479-ba40-445f-a018-08a0689b9547\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0479f4bb7141ac2e5f5eda2994f1bca6f2b3ded35fce60581a8428858575bf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e464b77d04edaec7bd1158adcf6a0b18f0baa4417c81703570f156e7165877c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e464b77d04edaec7bd1158adcf6a0b18f0baa4417c81703570f156e7165877c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.588844 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.602347 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.602631 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:20 crc 
kubenswrapper[4956]: I0930 05:30:20.602738 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.602868 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.602962 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:20Z","lastTransitionTime":"2025-09-30T05:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.607200 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.617928 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.628406 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.637749 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc 
kubenswrapper[4956]: I0930 05:30:20.649660 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.662198 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:20Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.705855 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.705885 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.705893 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.705907 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.705917 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:20Z","lastTransitionTime":"2025-09-30T05:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.808461 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.808521 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.808539 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.808562 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.808579 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:20Z","lastTransitionTime":"2025-09-30T05:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.911279 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.911339 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.911357 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.911381 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:20 crc kubenswrapper[4956]: I0930 05:30:20.911400 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:20Z","lastTransitionTime":"2025-09-30T05:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.014185 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.014244 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.014261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.014285 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.014303 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:21Z","lastTransitionTime":"2025-09-30T05:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.117360 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.117423 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.117446 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.117476 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.117495 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:21Z","lastTransitionTime":"2025-09-30T05:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.219401 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.219428 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.219436 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.219449 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.219457 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:21Z","lastTransitionTime":"2025-09-30T05:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.322135 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.322172 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.322180 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.322194 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.322203 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:21Z","lastTransitionTime":"2025-09-30T05:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.340740 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:21 crc kubenswrapper[4956]: E0930 05:30:21.340949 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.424775 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.424848 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.424868 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.424894 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.424916 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:21Z","lastTransitionTime":"2025-09-30T05:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.526911 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.526970 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.526994 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.527022 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.527042 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:21Z","lastTransitionTime":"2025-09-30T05:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.630572 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.630632 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.630650 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.630673 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.630691 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:21Z","lastTransitionTime":"2025-09-30T05:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.733799 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.733843 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.733860 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.733882 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.733899 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:21Z","lastTransitionTime":"2025-09-30T05:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.836274 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.836363 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.836387 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.836452 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.836480 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:21Z","lastTransitionTime":"2025-09-30T05:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.939484 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.939549 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.939566 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.939594 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:21 crc kubenswrapper[4956]: I0930 05:30:21.939614 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:21Z","lastTransitionTime":"2025-09-30T05:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.042013 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.042058 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.042069 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.042086 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.042097 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:22Z","lastTransitionTime":"2025-09-30T05:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.144655 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.144714 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.144732 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.144756 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.144773 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:22Z","lastTransitionTime":"2025-09-30T05:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.247780 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.247849 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.247873 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.247907 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.247932 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:22Z","lastTransitionTime":"2025-09-30T05:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.340859 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.340924 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:22 crc kubenswrapper[4956]: E0930 05:30:22.341030 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.341202 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:22 crc kubenswrapper[4956]: E0930 05:30:22.341257 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:22 crc kubenswrapper[4956]: E0930 05:30:22.341505 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.350304 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.350377 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.350396 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.350417 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.350436 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:22Z","lastTransitionTime":"2025-09-30T05:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.453668 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.453700 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.453711 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.453727 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.453738 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:22Z","lastTransitionTime":"2025-09-30T05:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.556506 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.556533 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.556543 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.556557 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.556567 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:22Z","lastTransitionTime":"2025-09-30T05:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.659772 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.659830 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.659847 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.659869 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.659889 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:22Z","lastTransitionTime":"2025-09-30T05:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.762616 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.762683 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.762704 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.762729 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.762748 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:22Z","lastTransitionTime":"2025-09-30T05:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.865628 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.865699 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.865720 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.865743 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.865764 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:22Z","lastTransitionTime":"2025-09-30T05:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.968814 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.968888 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.968912 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.968942 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:22 crc kubenswrapper[4956]: I0930 05:30:22.968973 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:22Z","lastTransitionTime":"2025-09-30T05:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.071893 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.072263 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.072562 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.072783 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.072934 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:23Z","lastTransitionTime":"2025-09-30T05:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.176499 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.176847 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.177000 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.177183 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.177396 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:23Z","lastTransitionTime":"2025-09-30T05:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.280248 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.280301 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.280320 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.280360 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.280394 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:23Z","lastTransitionTime":"2025-09-30T05:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.340451 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:23 crc kubenswrapper[4956]: E0930 05:30:23.340690 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.383031 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.383094 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.383112 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.383166 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.383543 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:23Z","lastTransitionTime":"2025-09-30T05:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.487287 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.487343 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.487359 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.487383 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.487401 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:23Z","lastTransitionTime":"2025-09-30T05:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.590295 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.590363 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.590390 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.590417 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.590439 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:23Z","lastTransitionTime":"2025-09-30T05:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.692763 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.692822 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.692845 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.692870 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.692885 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:23Z","lastTransitionTime":"2025-09-30T05:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.796418 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.796461 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.796502 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.796519 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.796534 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:23Z","lastTransitionTime":"2025-09-30T05:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.899775 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.899855 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.899881 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.899912 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:23 crc kubenswrapper[4956]: I0930 05:30:23.899935 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:23Z","lastTransitionTime":"2025-09-30T05:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.002868 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.002908 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.002919 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.002936 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.002947 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:24Z","lastTransitionTime":"2025-09-30T05:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.105985 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.106242 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.106318 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.106493 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.106602 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:24Z","lastTransitionTime":"2025-09-30T05:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.209078 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.209331 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.209380 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.209411 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.209428 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:24Z","lastTransitionTime":"2025-09-30T05:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.312034 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.312149 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.312175 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.312206 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.312227 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:24Z","lastTransitionTime":"2025-09-30T05:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.340973 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.340978 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.341089 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:24 crc kubenswrapper[4956]: E0930 05:30:24.341293 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:24 crc kubenswrapper[4956]: E0930 05:30:24.341413 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:24 crc kubenswrapper[4956]: E0930 05:30:24.341579 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.415522 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.415567 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.415581 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.415599 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.415611 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:24Z","lastTransitionTime":"2025-09-30T05:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.518181 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.518244 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.518262 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.518288 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.518306 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:24Z","lastTransitionTime":"2025-09-30T05:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.620377 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.620434 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.620450 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.620469 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.620483 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:24Z","lastTransitionTime":"2025-09-30T05:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.722766 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.722913 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.722926 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.722945 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.722956 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:24Z","lastTransitionTime":"2025-09-30T05:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.824912 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.824974 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.824989 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.825011 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.825029 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:24Z","lastTransitionTime":"2025-09-30T05:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.927743 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.927805 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.927824 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.927848 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:24 crc kubenswrapper[4956]: I0930 05:30:24.927866 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:24Z","lastTransitionTime":"2025-09-30T05:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.030189 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.030235 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.030248 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.030271 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.030286 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:25Z","lastTransitionTime":"2025-09-30T05:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.132103 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.132185 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.132199 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.132220 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.132234 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:25Z","lastTransitionTime":"2025-09-30T05:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.235787 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.235829 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.235842 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.235865 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.235881 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:25Z","lastTransitionTime":"2025-09-30T05:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.338601 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.338643 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.338654 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.338668 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.338680 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:25Z","lastTransitionTime":"2025-09-30T05:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.340873 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:25 crc kubenswrapper[4956]: E0930 05:30:25.341095 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.342575 4956 scope.go:117] "RemoveContainer" containerID="f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da" Sep 30 05:30:25 crc kubenswrapper[4956]: E0930 05:30:25.342804 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.441346 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.441403 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.441414 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.441431 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.441441 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:25Z","lastTransitionTime":"2025-09-30T05:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.543724 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.543773 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.543790 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.543813 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.543829 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:25Z","lastTransitionTime":"2025-09-30T05:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.646524 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.646564 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.646574 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.646589 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.646603 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:25Z","lastTransitionTime":"2025-09-30T05:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.748896 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.748936 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.748948 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.748967 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.748979 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:25Z","lastTransitionTime":"2025-09-30T05:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.851047 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.851105 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.851157 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.851186 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.851205 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:25Z","lastTransitionTime":"2025-09-30T05:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.953981 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.954048 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.954067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.954092 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:25 crc kubenswrapper[4956]: I0930 05:30:25.954111 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:25Z","lastTransitionTime":"2025-09-30T05:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.056742 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.056796 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.056807 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.056830 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.056841 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:26Z","lastTransitionTime":"2025-09-30T05:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.159801 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.159864 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.159882 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.159906 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.159923 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:26Z","lastTransitionTime":"2025-09-30T05:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.263410 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.263477 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.263500 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.263526 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.263547 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:26Z","lastTransitionTime":"2025-09-30T05:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.340108 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.340190 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:26 crc kubenswrapper[4956]: E0930 05:30:26.340450 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:26 crc kubenswrapper[4956]: E0930 05:30:26.340563 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.340293 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:26 crc kubenswrapper[4956]: E0930 05:30:26.340962 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.366456 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.366501 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.366513 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.366531 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.366543 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:26Z","lastTransitionTime":"2025-09-30T05:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.470090 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.470189 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.470208 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.470232 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.470252 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:26Z","lastTransitionTime":"2025-09-30T05:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.572969 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.573072 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.573165 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.573359 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.573460 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:26Z","lastTransitionTime":"2025-09-30T05:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.676429 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.676484 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.676501 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.676526 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.676542 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:26Z","lastTransitionTime":"2025-09-30T05:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.779831 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.779878 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.779889 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.779912 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.779934 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:26Z","lastTransitionTime":"2025-09-30T05:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.883571 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.883618 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.883630 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.883646 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.883658 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:26Z","lastTransitionTime":"2025-09-30T05:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.986930 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.986990 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.987002 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.987021 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:26 crc kubenswrapper[4956]: I0930 05:30:26.987033 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:26Z","lastTransitionTime":"2025-09-30T05:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.090517 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.090590 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.090609 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.090634 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.090653 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:27Z","lastTransitionTime":"2025-09-30T05:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.193553 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.193624 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.193645 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.193674 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.193697 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:27Z","lastTransitionTime":"2025-09-30T05:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.296540 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.296603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.296619 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.296645 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.296662 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:27Z","lastTransitionTime":"2025-09-30T05:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.340211 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:27 crc kubenswrapper[4956]: E0930 05:30:27.340539 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.399310 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.399376 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.399398 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.399426 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.399448 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:27Z","lastTransitionTime":"2025-09-30T05:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.502178 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.502220 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.502230 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.502247 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.502257 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:27Z","lastTransitionTime":"2025-09-30T05:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.605033 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.605165 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.605197 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.605227 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.605249 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:27Z","lastTransitionTime":"2025-09-30T05:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.708212 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.708261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.708298 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.708317 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.708329 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:27Z","lastTransitionTime":"2025-09-30T05:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.810551 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.810598 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.810612 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.810630 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.810642 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:27Z","lastTransitionTime":"2025-09-30T05:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.912940 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.912977 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.912986 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.912999 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:27 crc kubenswrapper[4956]: I0930 05:30:27.913008 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:27Z","lastTransitionTime":"2025-09-30T05:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.014913 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.014953 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.014962 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.014977 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.014985 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.118436 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.118517 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.118542 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.118573 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.118593 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.196519 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:28 crc kubenswrapper[4956]: E0930 05:30:28.196682 4956 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:30:28 crc kubenswrapper[4956]: E0930 05:30:28.196774 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs podName:184140db-c30d-4f88-89ff-b7aa2dcca3d1 nodeName:}" failed. No retries permitted until 2025-09-30 05:31:32.196752783 +0000 UTC m=+162.523873318 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs") pod "network-metrics-daemon-ctwgh" (UID: "184140db-c30d-4f88-89ff-b7aa2dcca3d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.221848 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.221921 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.221938 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.221964 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.221987 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.325332 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.325418 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.325438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.325463 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.325481 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.340087 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.340204 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.340437 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:28 crc kubenswrapper[4956]: E0930 05:30:28.340413 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:28 crc kubenswrapper[4956]: E0930 05:30:28.340903 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:28 crc kubenswrapper[4956]: E0930 05:30:28.340972 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.353940 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.353990 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.354010 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.354028 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.354041 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: E0930 05:30:28.371634 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:28Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.375678 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.375724 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.375741 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.375762 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.375777 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: E0930 05:30:28.392061 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:28Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.396993 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.397041 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.397056 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.397077 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.397095 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: E0930 05:30:28.413309 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:28Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.421478 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.421518 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.421532 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.421549 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.421562 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: E0930 05:30:28.439017 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:28Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.442992 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.443178 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.443205 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.443233 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.443259 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: E0930 05:30:28.461532 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:28Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:28 crc kubenswrapper[4956]: E0930 05:30:28.461642 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.463440 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.463499 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.463512 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.463532 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.463543 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.566054 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.566127 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.566141 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.566160 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.566173 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.669326 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.669370 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.669378 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.669410 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.669423 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.772413 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.772491 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.772517 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.772552 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.772581 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.874603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.874653 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.874663 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.874675 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.874685 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.977838 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.977897 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.977925 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.977951 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:28 crc kubenswrapper[4956]: I0930 05:30:28.977969 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:28Z","lastTransitionTime":"2025-09-30T05:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.081147 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.081220 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.081241 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.081272 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.081294 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:29Z","lastTransitionTime":"2025-09-30T05:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.184652 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.184694 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.184704 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.184720 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.184729 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:29Z","lastTransitionTime":"2025-09-30T05:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.287840 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.287895 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.287911 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.287936 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.287953 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:29Z","lastTransitionTime":"2025-09-30T05:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.339941 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:29 crc kubenswrapper[4956]: E0930 05:30:29.340168 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.391145 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.391210 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.391231 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.391257 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.391274 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:29Z","lastTransitionTime":"2025-09-30T05:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.494102 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.494178 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.494195 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.494217 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.494233 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:29Z","lastTransitionTime":"2025-09-30T05:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.597580 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.597643 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.597665 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.597689 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.597706 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:29Z","lastTransitionTime":"2025-09-30T05:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.701260 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.701318 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.701336 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.701358 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.701376 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:29Z","lastTransitionTime":"2025-09-30T05:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.805003 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.805050 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.805064 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.805085 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.805100 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:29Z","lastTransitionTime":"2025-09-30T05:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.908276 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.908330 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.908345 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.908371 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:29 crc kubenswrapper[4956]: I0930 05:30:29.908390 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:29Z","lastTransitionTime":"2025-09-30T05:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.011942 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.012019 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.012036 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.012060 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.012077 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:30Z","lastTransitionTime":"2025-09-30T05:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.115320 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.115358 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.115368 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.115382 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.115392 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:30Z","lastTransitionTime":"2025-09-30T05:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.218621 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.218667 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.218682 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.218704 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.218721 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:30Z","lastTransitionTime":"2025-09-30T05:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.321826 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.321869 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.321879 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.321896 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.321908 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:30Z","lastTransitionTime":"2025-09-30T05:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.340908 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:30 crc kubenswrapper[4956]: E0930 05:30:30.341160 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.341274 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.341306 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:30 crc kubenswrapper[4956]: E0930 05:30:30.341428 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:30 crc kubenswrapper[4956]: E0930 05:30:30.341575 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.358762 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb0b6479-ba40-445f-a018-08a0689b9547\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0479f4bb7141ac2e5f5eda2994f1bca6f2b3ded35fce60581a8428858575bf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e464b77d04edaec7bd1158adcf6a0b18f0baa4417c81703570f156e7165877c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e464b77d04edaec7bd1158adcf6a0b18f0baa4417c81703570f156e7165877c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.379683 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.398942 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.415160 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.430829 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.430916 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.430940 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:30 crc 
kubenswrapper[4956]: I0930 05:30:30.430970 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.430991 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:30Z","lastTransitionTime":"2025-09-30T05:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.433618 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.453743 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.469712 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.484192 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.499394 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.513913 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc 
kubenswrapper[4956]: I0930 05:30:30.532668 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.532695 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.532704 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.532716 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.532725 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:30Z","lastTransitionTime":"2025-09-30T05:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.533883 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b353d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.547625 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eecd9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09
-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.565089 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.586440 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.624511 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7
f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.635392 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.635433 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.635443 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.635462 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.635474 4956 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:30Z","lastTransitionTime":"2025-09-30T05:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.644763 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.660152 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"2025-09-30T05:29:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73\\\\n2025-09-30T05:29:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73 to /host/opt/cni/bin/\\\\n2025-09-30T05:29:12Z [verbose] multus-daemon started\\\\n2025-09-30T05:29:12Z [verbose] Readiness Indicator file check\\\\n2025-09-30T05:29:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.678223 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:30:12Z\\\",\\\"message\\\":\\\", Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 05:30:12.300777 7053 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.109\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 05:30:12.300826 7053 services_controller.go:452] Built service openshift-kube-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0930 05:30:12.300762 7053 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:30:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.689068 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:30Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.737927 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.737961 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.737972 4956 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.737988 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.738000 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:30Z","lastTransitionTime":"2025-09-30T05:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.840692 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.840732 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.840746 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.840766 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.840780 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:30Z","lastTransitionTime":"2025-09-30T05:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.942431 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.942469 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.942482 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.942498 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:30 crc kubenswrapper[4956]: I0930 05:30:30.942508 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:30Z","lastTransitionTime":"2025-09-30T05:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.045005 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.045039 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.045050 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.045067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.045079 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:31Z","lastTransitionTime":"2025-09-30T05:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.148082 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.148226 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.148250 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.148277 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.148297 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:31Z","lastTransitionTime":"2025-09-30T05:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.251260 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.251347 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.251371 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.251393 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.251408 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:31Z","lastTransitionTime":"2025-09-30T05:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.340990 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:31 crc kubenswrapper[4956]: E0930 05:30:31.341202 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.354056 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.354098 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.354108 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.354148 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.354160 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:31Z","lastTransitionTime":"2025-09-30T05:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.456722 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.456769 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.456785 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.456809 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.456827 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:31Z","lastTransitionTime":"2025-09-30T05:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.560308 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.560375 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.560393 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.560415 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.560433 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:31Z","lastTransitionTime":"2025-09-30T05:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.663586 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.663672 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.663692 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.663715 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.663731 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:31Z","lastTransitionTime":"2025-09-30T05:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.766200 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.766232 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.766241 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.766253 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.766263 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:31Z","lastTransitionTime":"2025-09-30T05:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.868667 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.868725 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.868741 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.868765 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.868789 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:31Z","lastTransitionTime":"2025-09-30T05:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.971258 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.971315 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.971332 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.971356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:31 crc kubenswrapper[4956]: I0930 05:30:31.971372 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:31Z","lastTransitionTime":"2025-09-30T05:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.080547 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.080617 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.080637 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.080664 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.080681 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:32Z","lastTransitionTime":"2025-09-30T05:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.182867 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.182903 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.182912 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.182924 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.182933 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:32Z","lastTransitionTime":"2025-09-30T05:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.284795 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.284854 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.284872 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.284895 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.284913 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:32Z","lastTransitionTime":"2025-09-30T05:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.340634 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.340723 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.340658 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:32 crc kubenswrapper[4956]: E0930 05:30:32.340832 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:32 crc kubenswrapper[4956]: E0930 05:30:32.340933 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:32 crc kubenswrapper[4956]: E0930 05:30:32.341005 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.387672 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.387799 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.387819 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.387841 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.387858 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:32Z","lastTransitionTime":"2025-09-30T05:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.490632 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.490691 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.490713 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.490736 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.490758 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:32Z","lastTransitionTime":"2025-09-30T05:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.594688 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.594754 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.594772 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.594794 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.594811 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:32Z","lastTransitionTime":"2025-09-30T05:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.698084 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.698188 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.698226 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.698263 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.698284 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:32Z","lastTransitionTime":"2025-09-30T05:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.801237 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.801313 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.801339 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.801368 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.801390 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:32Z","lastTransitionTime":"2025-09-30T05:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.904996 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.905060 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.905099 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.905180 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:32 crc kubenswrapper[4956]: I0930 05:30:32.905209 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:32Z","lastTransitionTime":"2025-09-30T05:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.007487 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.007544 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.007562 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.007588 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.007605 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:33Z","lastTransitionTime":"2025-09-30T05:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.110635 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.110694 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.110713 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.110737 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.110759 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:33Z","lastTransitionTime":"2025-09-30T05:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.213629 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.213704 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.213724 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.213750 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.213768 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:33Z","lastTransitionTime":"2025-09-30T05:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.317013 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.317055 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.317066 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.317081 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.317092 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:33Z","lastTransitionTime":"2025-09-30T05:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.341063 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:33 crc kubenswrapper[4956]: E0930 05:30:33.341289 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.419509 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.419579 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.419604 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.419637 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.419661 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:33Z","lastTransitionTime":"2025-09-30T05:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.523045 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.523111 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.523174 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.523199 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.523217 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:33Z","lastTransitionTime":"2025-09-30T05:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.625941 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.625992 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.626012 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.626038 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.626056 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:33Z","lastTransitionTime":"2025-09-30T05:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.729431 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.729487 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.729509 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.729538 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.729560 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:33Z","lastTransitionTime":"2025-09-30T05:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.832357 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.832417 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.832440 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.832468 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.832528 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:33Z","lastTransitionTime":"2025-09-30T05:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.936173 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.936238 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.936261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.936288 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:33 crc kubenswrapper[4956]: I0930 05:30:33.936308 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:33Z","lastTransitionTime":"2025-09-30T05:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.039525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.039589 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.039615 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.039644 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.039664 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:34Z","lastTransitionTime":"2025-09-30T05:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.142730 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.142794 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.142817 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.142846 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.142867 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:34Z","lastTransitionTime":"2025-09-30T05:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.245353 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.245434 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.245458 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.245572 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.245599 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:34Z","lastTransitionTime":"2025-09-30T05:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.340624 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.340666 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:34 crc kubenswrapper[4956]: E0930 05:30:34.340797 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.340652 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:34 crc kubenswrapper[4956]: E0930 05:30:34.340907 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:34 crc kubenswrapper[4956]: E0930 05:30:34.340962 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.347951 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.347982 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.347993 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.348007 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.348017 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:34Z","lastTransitionTime":"2025-09-30T05:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.449810 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.449872 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.449883 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.449898 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.449911 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:34Z","lastTransitionTime":"2025-09-30T05:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.552695 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.552729 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.552737 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.552750 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.552759 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:34Z","lastTransitionTime":"2025-09-30T05:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.655070 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.655105 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.655134 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.655190 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.655202 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:34Z","lastTransitionTime":"2025-09-30T05:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.758008 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.758072 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.758090 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.758155 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.758177 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:34Z","lastTransitionTime":"2025-09-30T05:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.860799 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.860870 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.860892 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.860918 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.860934 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:34Z","lastTransitionTime":"2025-09-30T05:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.963945 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.964016 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.964039 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.964067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:34 crc kubenswrapper[4956]: I0930 05:30:34.964090 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:34Z","lastTransitionTime":"2025-09-30T05:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.067206 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.067260 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.067277 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.067300 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.067317 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:35Z","lastTransitionTime":"2025-09-30T05:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.170577 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.170668 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.170686 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.170709 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.170727 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:35Z","lastTransitionTime":"2025-09-30T05:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.274339 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.274495 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.274529 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.274558 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.274581 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:35Z","lastTransitionTime":"2025-09-30T05:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.340822 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:35 crc kubenswrapper[4956]: E0930 05:30:35.341065 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.377012 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.377061 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.377077 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.377101 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.377148 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:35Z","lastTransitionTime":"2025-09-30T05:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.480184 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.480289 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.480315 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.480345 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.480365 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:35Z","lastTransitionTime":"2025-09-30T05:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.583098 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.583151 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.583159 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.583172 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.583180 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:35Z","lastTransitionTime":"2025-09-30T05:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.685626 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.685692 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.685717 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.685745 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.685769 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:35Z","lastTransitionTime":"2025-09-30T05:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.788777 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.788815 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.788824 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.788839 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.788849 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:35Z","lastTransitionTime":"2025-09-30T05:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.891335 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.891455 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.891475 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.891497 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.891514 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:35Z","lastTransitionTime":"2025-09-30T05:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.994648 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.994703 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.994721 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.994743 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:35 crc kubenswrapper[4956]: I0930 05:30:35.994759 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:35Z","lastTransitionTime":"2025-09-30T05:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.098077 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.098168 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.098367 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.098391 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.098409 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:36Z","lastTransitionTime":"2025-09-30T05:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.200703 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.200753 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.200769 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.200793 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.200812 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:36Z","lastTransitionTime":"2025-09-30T05:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.303438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.303493 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.303510 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.303531 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.303550 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:36Z","lastTransitionTime":"2025-09-30T05:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.340503 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.340582 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.340530 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:36 crc kubenswrapper[4956]: E0930 05:30:36.340718 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:36 crc kubenswrapper[4956]: E0930 05:30:36.340879 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:36 crc kubenswrapper[4956]: E0930 05:30:36.341017 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.406754 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.406821 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.406844 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.406872 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.406896 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:36Z","lastTransitionTime":"2025-09-30T05:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.509916 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.509981 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.510001 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.510023 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.510041 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:36Z","lastTransitionTime":"2025-09-30T05:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.613448 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.613527 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.613545 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.613571 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.613591 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:36Z","lastTransitionTime":"2025-09-30T05:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.717046 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.717104 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.717161 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.717192 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.717209 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:36Z","lastTransitionTime":"2025-09-30T05:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.819799 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.819854 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.819872 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.819895 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.819913 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:36Z","lastTransitionTime":"2025-09-30T05:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.923086 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.923180 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.923198 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.923225 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:36 crc kubenswrapper[4956]: I0930 05:30:36.923242 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:36Z","lastTransitionTime":"2025-09-30T05:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.026081 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.026190 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.026216 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.026246 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.026270 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:37Z","lastTransitionTime":"2025-09-30T05:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.129327 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.129385 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.129402 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.129428 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.129445 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:37Z","lastTransitionTime":"2025-09-30T05:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.232545 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.232652 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.232683 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.232714 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.232739 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:37Z","lastTransitionTime":"2025-09-30T05:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.335629 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.335696 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.335721 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.335751 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.335768 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:37Z","lastTransitionTime":"2025-09-30T05:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.340327 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:37 crc kubenswrapper[4956]: E0930 05:30:37.340828 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.438287 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.438345 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.438362 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.438385 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.438407 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:37Z","lastTransitionTime":"2025-09-30T05:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.541084 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.541275 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.541304 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.541334 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.541355 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:37Z","lastTransitionTime":"2025-09-30T05:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.645421 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.645503 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.645525 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.645560 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.645582 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:37Z","lastTransitionTime":"2025-09-30T05:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.748490 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.748546 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.748563 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.748588 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.748605 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:37Z","lastTransitionTime":"2025-09-30T05:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.851810 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.851872 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.851894 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.851922 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.851964 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:37Z","lastTransitionTime":"2025-09-30T05:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.954958 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.955001 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.955012 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.955029 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:37 crc kubenswrapper[4956]: I0930 05:30:37.955041 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:37Z","lastTransitionTime":"2025-09-30T05:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.058767 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.058824 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.058841 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.058865 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.058882 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.161461 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.161516 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.161535 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.161558 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.161575 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.265059 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.265151 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.265169 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.265192 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.265209 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.339992 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:38 crc kubenswrapper[4956]: E0930 05:30:38.340196 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.340385 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:38 crc kubenswrapper[4956]: E0930 05:30:38.340559 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.340612 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:38 crc kubenswrapper[4956]: E0930 05:30:38.340720 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.368474 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.368580 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.368601 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.368664 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.368682 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.471355 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.471400 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.471410 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.471428 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.471440 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.574821 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.574898 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.574922 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.574956 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.574979 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.678183 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.678233 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.678253 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.678277 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.678295 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.717938 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.717999 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.718017 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.718040 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.718059 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: E0930 05:30:38.740650 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:38Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.745999 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.746227 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.746249 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.746273 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.746289 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.766821 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.767271 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.767486 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.767662 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.767834 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: E0930 05:30:38.788011 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:38Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.792280 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.792335 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.792353 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.792378 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.792395 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: E0930 05:30:38.808982 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:38Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.814683 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.814737 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.814751 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.814772 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.814788 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: E0930 05:30:38.828239 4956 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T05:30:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a1a66a95-8ae8-4f6f-8bdf-6a49eb9ca091\\\",\\\"systemUUID\\\":\\\"623396aa-9d66-4fad-a73b-3a90f4645680\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:38Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:38 crc kubenswrapper[4956]: E0930 05:30:38.828414 4956 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.830201 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.830308 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.830382 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.830474 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.830584 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.933611 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.934168 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.934254 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.934354 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:38 crc kubenswrapper[4956]: I0930 05:30:38.934442 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:38Z","lastTransitionTime":"2025-09-30T05:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.036984 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.037571 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.038105 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.038605 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.039104 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:39Z","lastTransitionTime":"2025-09-30T05:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.143009 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.143077 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.143098 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.143164 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.143187 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:39Z","lastTransitionTime":"2025-09-30T05:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.245473 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.245540 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.245562 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.245588 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.245606 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:39Z","lastTransitionTime":"2025-09-30T05:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.341760 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:39 crc kubenswrapper[4956]: E0930 05:30:39.341982 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.342447 4956 scope.go:117] "RemoveContainer" containerID="f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da" Sep 30 05:30:39 crc kubenswrapper[4956]: E0930 05:30:39.342802 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.348224 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.348281 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.348306 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.348335 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.348358 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:39Z","lastTransitionTime":"2025-09-30T05:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.450794 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.450851 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.450871 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.450895 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.450912 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:39Z","lastTransitionTime":"2025-09-30T05:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.554191 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.554445 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.554517 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.554580 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.554639 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:39Z","lastTransitionTime":"2025-09-30T05:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.657933 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.657988 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.658005 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.658027 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.658044 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:39Z","lastTransitionTime":"2025-09-30T05:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.760775 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.760815 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.760824 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.760838 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.760849 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:39Z","lastTransitionTime":"2025-09-30T05:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.864104 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.864153 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.864162 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.864175 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.864185 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:39Z","lastTransitionTime":"2025-09-30T05:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.967108 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.968186 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.968339 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.968515 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:39 crc kubenswrapper[4956]: I0930 05:30:39.968649 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:39Z","lastTransitionTime":"2025-09-30T05:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.076270 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.076550 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.076633 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.076717 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.076804 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:40Z","lastTransitionTime":"2025-09-30T05:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.179840 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.180245 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.180399 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.180522 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.180641 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:40Z","lastTransitionTime":"2025-09-30T05:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.283107 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.283195 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.283207 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.283223 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.283234 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:40Z","lastTransitionTime":"2025-09-30T05:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.340420 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.340420 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:40 crc kubenswrapper[4956]: E0930 05:30:40.340843 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.340531 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:40 crc kubenswrapper[4956]: E0930 05:30:40.340772 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:40 crc kubenswrapper[4956]: E0930 05:30:40.341103 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.363875 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f7698c0-e774-40d2-9fa6-920802db5a79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05125a1f272dfcd9ebfc6a3911e48e60f7d8df9e110c8bf0b8f901b23c2ed5c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d749715749a160cef6389e02b8dc9d646d9aded49ea925ee9d8a2e213c9d1cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eb7b9775c791788e87334b7028fc32fc5046479487c48bcc6fd1e97a619a1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43dc2fcc360758ac7f75ff69cd41c01992f6c300a83e35e94c6a542ea87fafb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://880c4ed5ea4d7715cf38f7911c9ccbe36ffb04581384e2145b431ffe3a1027c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c81d56dcee873c1a42b3d5db4671ad6b4c5bda3ef0d498369f325fc243e548c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f739eedcf5ff2fae6c69a85bb70e987140434c78860f26124ebdad68c7f654c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ef6d2f02f252fa64f54b1b1f68531e2edd371d50c483eea75297ee6d9284215\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2025-09-30T05:28:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.376103 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ee57fb585cb9bf3af8fd5964b85856ac2426e588a35647473c6b9c8bbf0192\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.385267 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.385300 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.385314 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.385333 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.385347 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:40Z","lastTransitionTime":"2025-09-30T05:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.389677 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-frfx9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72ad9902-843c-4117-9ac1-c34d525c9d55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:29:57Z\\\",\\\"message\\\":\\\"2025-09-30T05:29:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73\\\\n2025-09-30T05:29:12+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_066c011d-8b4f-4792-8a83-9e5af895bc73 to /host/opt/cni/bin/\\\\n2025-09-30T05:29:12Z [verbose] multus-daemon started\\\\n2025-09-30T05:29:12Z [verbose] Readiness Indicator file check\\\\n2025-09-30T05:29:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwjr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-frfx9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.409062 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29df1c73-1262-4143-b710-bc690edc2ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T05:30:12Z\\\",\\\"message\\\":\\\", Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 05:30:12.300777 7053 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.109\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 05:30:12.300826 7053 services_controller.go:452] Built service openshift-kube-apiserver-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0930 05:30:12.300762 7053 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:30:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j8sw2_openshift-ovn-kubernetes(29df1c73-1262-4143-b710-bc690edc2ab8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05059544f05d99b60a
1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p5xxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j8sw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.421360 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d99135-2439-4836-8a69-f4cabc091bb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0d6bc0a50e9977a4372f01a7a77404aad0b9faa1b7c47b7620d47fade0a84ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f625b23e389e00b2adf1c17941712bab69b4
047e8bac7c84ad0655fb825cc11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nf2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x27rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.430899 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb0b6479-ba40-445f-a018-08a0689b9547\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0479f4bb7141ac2e5f5eda2994f1bca6f2b3ded35fce60581a8428858575bf4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e464b77d04edaec7bd1158adcf6a0b18f0baa4417c81703570f156e7165877c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e464b77d04edaec7bd1158adcf6a0b18f0baa4417c81703570f156e7165877c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.443848 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.457473 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.470303 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ecd015b-e216-40d8-ae78-711b2a65c193\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d498526e38bfe036310b23be3e65fb99b6ae04609bd2016def7b2acae8a627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74ff672a676448ddd79714c7e6366de45489f88e
cf93d7aeaffa8c7835cbc6ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sjdq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hx8cm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.487308 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.487343 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.487353 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:40 crc 
kubenswrapper[4956]: I0930 05:30:40.487404 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.487417 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:40Z","lastTransitionTime":"2025-09-30T05:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.488361 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e2b15ac-7c8d-4cb4-9fc6-433ae3283c42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a40e0f643b09dc39b0b9a136d47c6a792de2fb40f31f7f6d59ff439dc810c018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10514dbcddad1899a263fbe9151c1496310515d2b8a89a8c7fcad1f01a802bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e6e362eb04eaf3b264d161ba7d3f216bfb12f37bfd0126f20242afdee801eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40dc990a45c4c169d589759948f3cab1e024c1e0b001ddce987dd9fad7229575\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.506594 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3109317236b7e13a2c9d2b9bc3f4bde29e292a7dd129a7311c1b9bea436767b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.522470 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2089c731380aaac222dcc86bbf379c2305a9cbe18ba2834a34999983edd5bd62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1949314421e589774766dd4a17887197a01f98ca6a1a27c62ac3a405d13a5ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.534071 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-htk97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da40fd61-e4f1-4780-bf28-5dd931e1a265\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab6d5ca6dd8848183bf5f3419e49139b19887e12ae47f6224fa9e745e58c9100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjqdh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-htk97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.546644 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xlssx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3be91b1-5806-4319-b8e2-71d37a81bc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3cbae63fb862112285e0e610a0a052214ccc68826f0a795fa145d4ad1cfd6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjcqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xlssx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.558233 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184140db-c30d-4f88-89ff-b7aa2dcca3d1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbkvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ctwgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc 
kubenswrapper[4956]: I0930 05:30:40.577692 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"263b8334-2a9e-4160-9c3e-ec165c71f0c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a04770963ab95796377dc4735a4e41360f72aa6b48bd8901a90ae1dd37b2b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc54adf8d4b35
3d8afec6dc4b1185e3a7591b3360170c7a1048f551e47ea64d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a648addd424c762f7df4c779a821b5d8da4e403face1a66b0fa9ef166ca117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aff6b3c560151012c3b63d768f2e855c47a1cca196d7a827bab3e35d2aa3b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://83e44d0d0b593b90b1ac4042f0f3ee3aa1a597ddb3935ed46c77d15e8cff982b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T05:29:04Z\\\",\\\"message\\\":\\\"W0930 05:28:53.534166 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 05:28:53.534597 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759210133 cert, and key in /tmp/serving-cert-2470931049/serving-signer.crt, /tmp/serving-cert-2470931049/serving-signer.key\\\\nI0930 05:28:53.896073 1 observer_polling.go:159] Starting file observer\\\\nW0930 05:28:53.897770 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 05:28:53.897860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 05:28:53.899091 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2470931049/tls.crt::/tmp/serving-cert-2470931049/tls.key\\\\\\\"\\\\nF0930 05:29:04.170505 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c82f30df87e4ce565adfd3f26b1a78d9893835175e49347cbcc2dbab4c53460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a673613faa7393ba0aa9d962b03b7fea87c3007d803565fced83671ca608118\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.590377 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.590427 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.590441 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.590460 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.590473 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:40Z","lastTransitionTime":"2025-09-30T05:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.593866 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"496197d9-0c88-4554-8dd0-340d3e708d13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:28:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28d80a22b077888b63a98fda3bd9bd79b67d66522e83805c9d1e90bf1d9341b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cfec1fc2efa8398dde0bbf6da8eec
d9434e2f71a302e8a31c97ba4f40518159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd798aaae48e1da6ede523546d7ed11a29d1b400c818e1dc4e3b8dd47f6beaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d68445467f7dbae067d5f1ca010fd9dad8cc2dfca755ab68ce39c6c6dc592a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:28:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:28:51Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:28:50Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.609971 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:08Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.628942 4956 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f38dd558-4728-4f7d-b69c-a523b09af345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T05:29:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32451a951837ff47313358287b7965508f737cb5f4ab91f20e808920d007108\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T05:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://172182a7ff1b449e23ce90bde0ec26d2ec3f988f5ef58fdd723db01397b050b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be22340290837aa3143463f202720de800fb03bf0ff6d7b335021974cef2d517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8addaa694cf320321dafe9183163a814b571005da9ebea40dae91a66f9c1f8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e025
f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e025f93518095dce9210789a91b0d53819bafcc0b764618d74e24e39deff873\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efb204cc632f76aaf33c7d10b17424cbd9972a5effbebbf012ac8aea01663d7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99030ce3b87304262545407d5b4778cd349368e4a2c201ea341eb0735e8be662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T05:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T05:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z98c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T05:29:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lpcwf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T05:30:40Z is after 2025-08-24T17:21:41Z" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.693527 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.693575 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.693595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.693616 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.693633 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:40Z","lastTransitionTime":"2025-09-30T05:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.796388 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.796465 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.796494 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.796526 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.796553 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:40Z","lastTransitionTime":"2025-09-30T05:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.899040 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.899075 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.899083 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.899095 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:40 crc kubenswrapper[4956]: I0930 05:30:40.899105 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:40Z","lastTransitionTime":"2025-09-30T05:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.001288 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.001883 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.002012 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.002196 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.002338 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:41Z","lastTransitionTime":"2025-09-30T05:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.104796 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.104841 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.104850 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.104866 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.104898 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:41Z","lastTransitionTime":"2025-09-30T05:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.207644 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.207682 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.207694 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.207711 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.207722 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:41Z","lastTransitionTime":"2025-09-30T05:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.309939 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.310365 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.310581 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.310988 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.311403 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:41Z","lastTransitionTime":"2025-09-30T05:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.341110 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:41 crc kubenswrapper[4956]: E0930 05:30:41.341506 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.414986 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.415483 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.415636 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.415780 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.415912 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:41Z","lastTransitionTime":"2025-09-30T05:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.518910 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.519480 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.519617 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.519799 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.519924 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:41Z","lastTransitionTime":"2025-09-30T05:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.622984 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.623037 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.623049 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.623067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.623082 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:41Z","lastTransitionTime":"2025-09-30T05:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.725847 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.725900 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.725917 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.725939 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.725956 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:41Z","lastTransitionTime":"2025-09-30T05:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.829264 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.829875 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.830079 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.830299 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.830463 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:41Z","lastTransitionTime":"2025-09-30T05:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.933425 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.933485 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.933502 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.933531 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:41 crc kubenswrapper[4956]: I0930 05:30:41.933551 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:41Z","lastTransitionTime":"2025-09-30T05:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.036767 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.037986 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.038175 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.038345 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.038653 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:42Z","lastTransitionTime":"2025-09-30T05:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.142486 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.142606 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.142630 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.142656 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.142672 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:42Z","lastTransitionTime":"2025-09-30T05:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.246376 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.246446 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.246470 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.246500 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.246525 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:42Z","lastTransitionTime":"2025-09-30T05:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.340858 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.340946 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.341070 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:42 crc kubenswrapper[4956]: E0930 05:30:42.341060 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:42 crc kubenswrapper[4956]: E0930 05:30:42.341298 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:42 crc kubenswrapper[4956]: E0930 05:30:42.341422 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.349214 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.349267 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.349284 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.349308 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.349328 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:42Z","lastTransitionTime":"2025-09-30T05:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.452359 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.452422 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.452440 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.452464 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.452482 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:42Z","lastTransitionTime":"2025-09-30T05:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.555446 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.555507 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.555524 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.555548 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.555566 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:42Z","lastTransitionTime":"2025-09-30T05:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.659643 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.659711 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.659735 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.659764 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.659786 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:42Z","lastTransitionTime":"2025-09-30T05:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.762736 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.762792 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.762809 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.762829 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.762841 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:42Z","lastTransitionTime":"2025-09-30T05:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.865847 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.865899 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.865915 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.865938 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.865955 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:42Z","lastTransitionTime":"2025-09-30T05:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.969346 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.969406 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.969429 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.969461 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:42 crc kubenswrapper[4956]: I0930 05:30:42.969483 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:42Z","lastTransitionTime":"2025-09-30T05:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.072259 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.072855 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.073015 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.073187 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.073280 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:43Z","lastTransitionTime":"2025-09-30T05:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.178021 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.178091 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.178152 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.178183 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.178202 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:43Z","lastTransitionTime":"2025-09-30T05:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.281356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.281604 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.281784 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.281957 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.282102 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:43Z","lastTransitionTime":"2025-09-30T05:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.340400 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:43 crc kubenswrapper[4956]: E0930 05:30:43.340599 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.385290 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.385352 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.385367 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.385388 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.385406 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:43Z","lastTransitionTime":"2025-09-30T05:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.488180 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.488332 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.488432 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.488575 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.488691 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:43Z","lastTransitionTime":"2025-09-30T05:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.591627 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.591915 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.592102 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.592334 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.592481 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:43Z","lastTransitionTime":"2025-09-30T05:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.695412 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.695466 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.695483 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.695506 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.695523 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:43Z","lastTransitionTime":"2025-09-30T05:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.797763 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.797803 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.797816 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.797836 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.797847 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:43Z","lastTransitionTime":"2025-09-30T05:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.900259 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.900750 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.900771 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.900793 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:43 crc kubenswrapper[4956]: I0930 05:30:43.900816 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:43Z","lastTransitionTime":"2025-09-30T05:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.003704 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.003752 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.003767 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.003789 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.003804 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:44Z","lastTransitionTime":"2025-09-30T05:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.109613 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.109662 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.109680 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.109716 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.109737 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:44Z","lastTransitionTime":"2025-09-30T05:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.211884 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.211929 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.211941 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.211958 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.211969 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:44Z","lastTransitionTime":"2025-09-30T05:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.314452 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.314533 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.314547 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.314563 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.314575 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:44Z","lastTransitionTime":"2025-09-30T05:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.341020 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.341038 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.341216 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:44 crc kubenswrapper[4956]: E0930 05:30:44.341454 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:44 crc kubenswrapper[4956]: E0930 05:30:44.341480 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:44 crc kubenswrapper[4956]: E0930 05:30:44.341534 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.416574 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.416627 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.416655 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.416671 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.416682 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:44Z","lastTransitionTime":"2025-09-30T05:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.519443 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.519499 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.519507 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.519522 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.519535 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:44Z","lastTransitionTime":"2025-09-30T05:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.621851 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.621887 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.621898 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.621913 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.621925 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:44Z","lastTransitionTime":"2025-09-30T05:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.724877 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.724953 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.724973 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.724997 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.725016 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:44Z","lastTransitionTime":"2025-09-30T05:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.827080 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.827134 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.827146 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.827160 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.827168 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:44Z","lastTransitionTime":"2025-09-30T05:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.883671 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-frfx9_72ad9902-843c-4117-9ac1-c34d525c9d55/kube-multus/1.log" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.884242 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-frfx9_72ad9902-843c-4117-9ac1-c34d525c9d55/kube-multus/0.log" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.884289 4956 generic.go:334] "Generic (PLEG): container finished" podID="72ad9902-843c-4117-9ac1-c34d525c9d55" containerID="13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119" exitCode=1 Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.884337 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-frfx9" event={"ID":"72ad9902-843c-4117-9ac1-c34d525c9d55","Type":"ContainerDied","Data":"13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119"} Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.884378 4956 scope.go:117] "RemoveContainer" containerID="7de87d49b617f0d9c8a902132e8b6bbf62557b4d82948fe206145caeb3793d3b" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.885976 4956 scope.go:117] "RemoveContainer" containerID="13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119" Sep 30 05:30:44 crc kubenswrapper[4956]: E0930 05:30:44.886295 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-frfx9_openshift-multus(72ad9902-843c-4117-9ac1-c34d525c9d55)\"" pod="openshift-multus/multus-frfx9" podUID="72ad9902-843c-4117-9ac1-c34d525c9d55" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.929603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:44 crc 
kubenswrapper[4956]: I0930 05:30:44.929654 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.929668 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.929689 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.929703 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:44Z","lastTransitionTime":"2025-09-30T05:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.962754 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=96.962725677 podStartE2EDuration="1m36.962725677s" podCreationTimestamp="2025-09-30 05:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:30:44.960799981 +0000 UTC m=+115.287920576" watchObservedRunningTime="2025-09-30 05:30:44.962725677 +0000 UTC m=+115.289846242" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.963082 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lpcwf" podStartSLOduration=94.963069309 podStartE2EDuration="1m34.963069309s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 05:30:44.942745925 +0000 UTC m=+115.269866510" watchObservedRunningTime="2025-09-30 05:30:44.963069309 +0000 UTC m=+115.290189864" Sep 30 05:30:44 crc kubenswrapper[4956]: I0930 05:30:44.977554 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=63.977534213 podStartE2EDuration="1m3.977534213s" podCreationTimestamp="2025-09-30 05:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:30:44.977100767 +0000 UTC m=+115.304221332" watchObservedRunningTime="2025-09-30 05:30:44.977534213 +0000 UTC m=+115.304654758" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.031836 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.031866 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.031874 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.031886 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.031895 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:45Z","lastTransitionTime":"2025-09-30T05:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.059107 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x27rf" podStartSLOduration=94.059090677 podStartE2EDuration="1m34.059090677s" podCreationTimestamp="2025-09-30 05:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:30:45.035054666 +0000 UTC m=+115.362175211" watchObservedRunningTime="2025-09-30 05:30:45.059090677 +0000 UTC m=+115.386211202" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.059464 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=94.059459699 podStartE2EDuration="1m34.059459699s" podCreationTimestamp="2025-09-30 05:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:30:45.058252618 +0000 UTC m=+115.385373193" watchObservedRunningTime="2025-09-30 05:30:45.059459699 +0000 UTC m=+115.386580234" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.092450 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podStartSLOduration=95.092428315 podStartE2EDuration="1m35.092428315s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:30:45.091507073 +0000 UTC m=+115.418627618" watchObservedRunningTime="2025-09-30 05:30:45.092428315 +0000 UTC m=+115.419548860" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.106185 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
podStartSLOduration=32.106164604 podStartE2EDuration="32.106164604s" podCreationTimestamp="2025-09-30 05:30:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:30:45.106017229 +0000 UTC m=+115.433137764" watchObservedRunningTime="2025-09-30 05:30:45.106164604 +0000 UTC m=+115.433285139" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.134356 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.134388 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.134397 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.134409 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.134418 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:45Z","lastTransitionTime":"2025-09-30T05:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.153562 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-htk97" podStartSLOduration=96.153546402 podStartE2EDuration="1m36.153546402s" podCreationTimestamp="2025-09-30 05:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:30:45.141869243 +0000 UTC m=+115.468989768" watchObservedRunningTime="2025-09-30 05:30:45.153546402 +0000 UTC m=+115.480666927" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.153649 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xlssx" podStartSLOduration=95.153644805 podStartE2EDuration="1m35.153644805s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:30:45.15288977 +0000 UTC m=+115.480010295" watchObservedRunningTime="2025-09-30 05:30:45.153644805 +0000 UTC m=+115.480765350" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.177571 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=94.177550062 podStartE2EDuration="1m34.177550062s" podCreationTimestamp="2025-09-30 05:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:30:45.176312209 +0000 UTC m=+115.503432734" watchObservedRunningTime="2025-09-30 05:30:45.177550062 +0000 UTC m=+115.504670607" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.236416 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 
05:30:45.236463 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.236477 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.236494 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.236506 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:45Z","lastTransitionTime":"2025-09-30T05:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.338585 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.338906 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.339106 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.339438 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.339582 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:45Z","lastTransitionTime":"2025-09-30T05:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.339932 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:45 crc kubenswrapper[4956]: E0930 05:30:45.340070 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.442029 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.442071 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.442081 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.442095 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.442104 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:45Z","lastTransitionTime":"2025-09-30T05:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.543643 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.543682 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.543691 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.543704 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.543713 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:45Z","lastTransitionTime":"2025-09-30T05:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.646260 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.646294 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.646306 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.646324 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.646336 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:45Z","lastTransitionTime":"2025-09-30T05:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.749329 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.749388 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.749414 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.749444 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.749467 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:45Z","lastTransitionTime":"2025-09-30T05:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.852031 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.852089 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.852105 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.852168 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.852187 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:45Z","lastTransitionTime":"2025-09-30T05:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.890331 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-frfx9_72ad9902-843c-4117-9ac1-c34d525c9d55/kube-multus/1.log" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.955008 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.955067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.955084 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.955109 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:45 crc kubenswrapper[4956]: I0930 05:30:45.955175 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:45Z","lastTransitionTime":"2025-09-30T05:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.057698 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.057751 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.057767 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.057793 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.057810 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:46Z","lastTransitionTime":"2025-09-30T05:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.161170 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.161227 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.161240 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.161261 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.161274 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:46Z","lastTransitionTime":"2025-09-30T05:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.263546 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.263603 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.263620 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.263643 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.263659 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:46Z","lastTransitionTime":"2025-09-30T05:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.340751 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.340843 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.340855 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:46 crc kubenswrapper[4956]: E0930 05:30:46.340991 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:46 crc kubenswrapper[4956]: E0930 05:30:46.341199 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 05:30:46 crc kubenswrapper[4956]: E0930 05:30:46.341373 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.366203 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.366318 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.366338 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.366408 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.366426 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:46Z","lastTransitionTime":"2025-09-30T05:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.469155 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.469218 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.469230 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.469246 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.469256 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:46Z","lastTransitionTime":"2025-09-30T05:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.572375 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.572497 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.572515 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.572538 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.572557 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:46Z","lastTransitionTime":"2025-09-30T05:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.675986 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.676102 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.676166 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.676194 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.676217 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:46Z","lastTransitionTime":"2025-09-30T05:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.779496 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.779559 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.779764 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.779792 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.779815 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:46Z","lastTransitionTime":"2025-09-30T05:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.883289 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.883323 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.883335 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.883350 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.883361 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:46Z","lastTransitionTime":"2025-09-30T05:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.986465 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.986518 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.986535 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.986561 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:46 crc kubenswrapper[4956]: I0930 05:30:46.986580 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:46Z","lastTransitionTime":"2025-09-30T05:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.089022 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.089089 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.089108 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.089166 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.089189 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:47Z","lastTransitionTime":"2025-09-30T05:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.192818 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.192870 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.192894 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.192914 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.192928 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:47Z","lastTransitionTime":"2025-09-30T05:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.295279 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.295349 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.295379 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.295411 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.295429 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:47Z","lastTransitionTime":"2025-09-30T05:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.340377 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:30:47 crc kubenswrapper[4956]: E0930 05:30:47.340525 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.398577 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.398647 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.398666 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.398690 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.398706 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:47Z","lastTransitionTime":"2025-09-30T05:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.506963 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.507067 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.507085 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.507140 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.507160 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:47Z","lastTransitionTime":"2025-09-30T05:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.610303 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.610368 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.610382 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.610405 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.610428 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:47Z","lastTransitionTime":"2025-09-30T05:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.712791 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.712864 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.712887 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.712917 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.712939 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:47Z","lastTransitionTime":"2025-09-30T05:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.815511 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.815572 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.815590 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.815616 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.815638 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:47Z","lastTransitionTime":"2025-09-30T05:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.918086 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.918142 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.918155 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.918177 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:47 crc kubenswrapper[4956]: I0930 05:30:47.918187 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:47Z","lastTransitionTime":"2025-09-30T05:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.021163 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.021204 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.021217 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.021234 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.021247 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:48Z","lastTransitionTime":"2025-09-30T05:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.124495 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.124581 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.124613 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.124627 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.124637 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:48Z","lastTransitionTime":"2025-09-30T05:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.227595 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.227669 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.227686 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.227710 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.227730 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:48Z","lastTransitionTime":"2025-09-30T05:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.330872 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.330923 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.330939 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.330956 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.330968 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:48Z","lastTransitionTime":"2025-09-30T05:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.340556 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.340613 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.340563 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:30:48 crc kubenswrapper[4956]: E0930 05:30:48.340719 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 05:30:48 crc kubenswrapper[4956]: E0930 05:30:48.340854 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 05:30:48 crc kubenswrapper[4956]: E0930 05:30:48.340995 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.433342 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.433392 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.433403 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.433419 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.433431 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:48Z","lastTransitionTime":"2025-09-30T05:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.536292 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.536343 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.536354 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.536372 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.536384 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:48Z","lastTransitionTime":"2025-09-30T05:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.638663 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.638726 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.638744 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.638769 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.638785 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:48Z","lastTransitionTime":"2025-09-30T05:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.740935 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.740992 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.741011 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.741034 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.741054 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:48Z","lastTransitionTime":"2025-09-30T05:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.844180 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.844241 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.844256 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.844280 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.844296 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:48Z","lastTransitionTime":"2025-09-30T05:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.854668 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.854705 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.854716 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.854732 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.854743 4956 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T05:30:48Z","lastTransitionTime":"2025-09-30T05:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.910473 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"]
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.910857 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.913276 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.913620 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.913916 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.914087 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.940692 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b2f43b5-d16a-49c2-a4bb-2d736198d704-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.940757 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2b2f43b5-d16a-49c2-a4bb-2d736198d704-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.940815 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2b2f43b5-d16a-49c2-a4bb-2d736198d704-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.940905 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b2f43b5-d16a-49c2-a4bb-2d736198d704-service-ca\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:48 crc kubenswrapper[4956]: I0930 05:30:48.940989 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b2f43b5-d16a-49c2-a4bb-2d736198d704-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.042264 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2b2f43b5-d16a-49c2-a4bb-2d736198d704-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.042329 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2b2f43b5-d16a-49c2-a4bb-2d736198d704-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.042355 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b2f43b5-d16a-49c2-a4bb-2d736198d704-service-ca\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.042390 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b2f43b5-d16a-49c2-a4bb-2d736198d704-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.042441 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b2f43b5-d16a-49c2-a4bb-2d736198d704-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.042473 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2b2f43b5-d16a-49c2-a4bb-2d736198d704-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.042505 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2b2f43b5-d16a-49c2-a4bb-2d736198d704-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.043387 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b2f43b5-d16a-49c2-a4bb-2d736198d704-service-ca\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.049525 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b2f43b5-d16a-49c2-a4bb-2d736198d704-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.066749 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b2f43b5-d16a-49c2-a4bb-2d736198d704-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-82jjx\" (UID: \"2b2f43b5-d16a-49c2-a4bb-2d736198d704\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.234202 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.341067 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh"
Sep 30 05:30:49 crc kubenswrapper[4956]: E0930 05:30:49.341289 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1"
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.905038 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx" event={"ID":"2b2f43b5-d16a-49c2-a4bb-2d736198d704","Type":"ContainerStarted","Data":"563655f2c219db8251bcc7d04e801e529d3703f251aa9efe3cef471ca30ec10a"}
Sep 30 05:30:49 crc kubenswrapper[4956]: I0930 05:30:49.905093 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx" event={"ID":"2b2f43b5-d16a-49c2-a4bb-2d736198d704","Type":"ContainerStarted","Data":"9a6699ae8d342465ca8e1f9998d2b4a0a66e0de0bf4466433c9d77a195ead88c"}
Sep 30 05:30:50 crc kubenswrapper[4956]: E0930 05:30:50.284653 4956 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Sep 30 05:30:50 crc kubenswrapper[4956]: I0930 05:30:50.340658 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 05:30:50 crc kubenswrapper[4956]: I0930 05:30:50.340700 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 05:30:50 crc kubenswrapper[4956]: E0930 05:30:50.341789 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 05:30:50 crc kubenswrapper[4956]: I0930 05:30:50.341840 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 05:30:50 crc kubenswrapper[4956]: E0930 05:30:50.342011 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 05:30:50 crc kubenswrapper[4956]: E0930 05:30:50.342102 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 05:30:50 crc kubenswrapper[4956]: E0930 05:30:50.479868 4956 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Sep 30 05:30:51 crc kubenswrapper[4956]: I0930 05:30:51.340046 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh"
Sep 30 05:30:51 crc kubenswrapper[4956]: E0930 05:30:51.340267 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1"
Sep 30 05:30:52 crc kubenswrapper[4956]: I0930 05:30:52.340487 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 05:30:52 crc kubenswrapper[4956]: I0930 05:30:52.340534 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 05:30:52 crc kubenswrapper[4956]: I0930 05:30:52.340658 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 05:30:52 crc kubenswrapper[4956]: E0930 05:30:52.340669 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 05:30:52 crc kubenswrapper[4956]: E0930 05:30:52.340846 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 05:30:52 crc kubenswrapper[4956]: E0930 05:30:52.340940 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 05:30:53 crc kubenswrapper[4956]: I0930 05:30:53.340711 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh"
Sep 30 05:30:53 crc kubenswrapper[4956]: E0930 05:30:53.341172 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1"
Sep 30 05:30:53 crc kubenswrapper[4956]: I0930 05:30:53.341369 4956 scope.go:117] "RemoveContainer" containerID="f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da"
Sep 30 05:30:53 crc kubenswrapper[4956]: I0930 05:30:53.919912 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/3.log"
Sep 30 05:30:53 crc kubenswrapper[4956]: I0930 05:30:53.922248 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerStarted","Data":"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3"}
Sep 30 05:30:53 crc kubenswrapper[4956]: I0930 05:30:53.923351 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2"
Sep 30 05:30:53 crc kubenswrapper[4956]: I0930 05:30:53.944655 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-82jjx" podStartSLOduration=103.944637797 podStartE2EDuration="1m43.944637797s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:30:49.923434535 +0000 UTC m=+120.250555140" watchObservedRunningTime="2025-09-30 05:30:53.944637797 +0000 UTC m=+124.271758342"
Sep 30 05:30:54 crc kubenswrapper[4956]: I0930 05:30:54.250634 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podStartSLOduration=104.250616423 podStartE2EDuration="1m44.250616423s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:30:53.945475376 +0000 UTC m=+124.272595911" watchObservedRunningTime="2025-09-30 05:30:54.250616423 +0000 UTC m=+124.577736948"
Sep 30 05:30:54 crc kubenswrapper[4956]: I0930 05:30:54.250947 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ctwgh"]
Sep 30 05:30:54 crc kubenswrapper[4956]: I0930 05:30:54.251021 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh"
Sep 30 05:30:54 crc kubenswrapper[4956]: E0930 05:30:54.251099 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1"
Sep 30 05:30:54 crc kubenswrapper[4956]: I0930 05:30:54.340170 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 05:30:54 crc kubenswrapper[4956]: I0930 05:30:54.340209 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 05:30:54 crc kubenswrapper[4956]: I0930 05:30:54.340171 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 05:30:54 crc kubenswrapper[4956]: E0930 05:30:54.340279 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 05:30:54 crc kubenswrapper[4956]: E0930 05:30:54.340352 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 05:30:54 crc kubenswrapper[4956]: E0930 05:30:54.340502 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 05:30:55 crc kubenswrapper[4956]: E0930 05:30:55.480848 4956 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Sep 30 05:30:56 crc kubenswrapper[4956]: I0930 05:30:56.340546 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 05:30:56 crc kubenswrapper[4956]: I0930 05:30:56.340621 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 05:30:56 crc kubenswrapper[4956]: I0930 05:30:56.340629 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 05:30:56 crc kubenswrapper[4956]: E0930 05:30:56.340705 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 05:30:56 crc kubenswrapper[4956]: I0930 05:30:56.340550 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh"
Sep 30 05:30:56 crc kubenswrapper[4956]: E0930 05:30:56.341008 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 05:30:56 crc kubenswrapper[4956]: E0930 05:30:56.341045 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1"
Sep 30 05:30:56 crc kubenswrapper[4956]: E0930 05:30:56.341404 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 05:30:58 crc kubenswrapper[4956]: I0930 05:30:58.340372 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 05:30:58 crc kubenswrapper[4956]: E0930 05:30:58.340606 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 05:30:58 crc kubenswrapper[4956]: I0930 05:30:58.340945 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 05:30:58 crc kubenswrapper[4956]: E0930 05:30:58.341050 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 05:30:58 crc kubenswrapper[4956]: I0930 05:30:58.341313 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh"
Sep 30 05:30:58 crc kubenswrapper[4956]: E0930 05:30:58.341444 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1"
Sep 30 05:30:58 crc kubenswrapper[4956]: I0930 05:30:58.341619 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 05:30:58 crc kubenswrapper[4956]: E0930 05:30:58.342052 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 05:30:58 crc kubenswrapper[4956]: I0930 05:30:58.342484 4956 scope.go:117] "RemoveContainer" containerID="13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119"
Sep 30 05:30:58 crc kubenswrapper[4956]: I0930 05:30:58.941167 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-frfx9_72ad9902-843c-4117-9ac1-c34d525c9d55/kube-multus/1.log"
Sep 30 05:30:58 crc kubenswrapper[4956]: I0930 05:30:58.941234 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-frfx9" event={"ID":"72ad9902-843c-4117-9ac1-c34d525c9d55","Type":"ContainerStarted","Data":"910f9a9e20921fea2d407aca2be189b0e75ce142086dfbbea368233848f74b83"}
Sep 30 05:30:58 crc kubenswrapper[4956]: I0930 05:30:58.969540 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-frfx9" podStartSLOduration=108.96951691699999 podStartE2EDuration="1m48.969516917s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:30:58.968854884 +0000 UTC m=+129.295975439" watchObservedRunningTime="2025-09-30 05:30:58.969516917 +0000 UTC m=+129.296637472"
Sep 30 05:30:59 crc kubenswrapper[4956]: I0930 05:30:59.397111 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2"
Sep 30 05:31:00 crc kubenswrapper[4956]: I0930 05:31:00.341223 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh"
Sep 30 05:31:00 crc kubenswrapper[4956]: I0930 05:31:00.341244 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 05:31:00 crc kubenswrapper[4956]: E0930 05:31:00.341641 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ctwgh" podUID="184140db-c30d-4f88-89ff-b7aa2dcca3d1"
Sep 30 05:31:00 crc kubenswrapper[4956]: I0930 05:31:00.341347 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 05:31:00 crc kubenswrapper[4956]: E0930 05:31:00.341754 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 05:31:00 crc kubenswrapper[4956]: I0930 05:31:00.341284 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 05:31:00 crc kubenswrapper[4956]: E0930 05:31:00.341836 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 05:31:00 crc kubenswrapper[4956]: E0930 05:31:00.341880 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 05:31:02 crc kubenswrapper[4956]: I0930 05:31:02.340348 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 05:31:02 crc kubenswrapper[4956]: I0930 05:31:02.340456 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh"
Sep 30 05:31:02 crc kubenswrapper[4956]: I0930 05:31:02.340386 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 05:31:02 crc kubenswrapper[4956]: I0930 05:31:02.340392 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 05:31:02 crc kubenswrapper[4956]: I0930 05:31:02.344417 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Sep 30 05:31:02 crc kubenswrapper[4956]: I0930 05:31:02.344892 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Sep 30 05:31:02 crc kubenswrapper[4956]: I0930 05:31:02.345698 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Sep 30 05:31:02 crc kubenswrapper[4956]: I0930 05:31:02.345923 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Sep 30 05:31:02 crc kubenswrapper[4956]: I0930 05:31:02.346038 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Sep 30 05:31:02 crc kubenswrapper[4956]: I0930 05:31:02.346454 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.462515 4956 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.522895 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mrbck"]
Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.523503 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.525010 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s62b4"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.526056 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.526809 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-442db"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.527306 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.532718 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9c8r"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.533208 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.534218 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.534271 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.534542 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.537603 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.538620 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.538698 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539063 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539210 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539356 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539420 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539462 4956 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539482 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539604 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539642 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539608 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539740 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539828 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539848 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539921 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539936 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.539985 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 
05:31:09.540014 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.541016 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.543003 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.549780 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.550742 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.558229 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rtd48"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.558340 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.558820 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.566499 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.567855 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.567913 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.567961 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.567859 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.569383 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.569412 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.569917 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.570401 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.572178 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.574474 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ztg8w"] 
Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.574948 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.578959 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2kcxx"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.579663 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.580957 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.581404 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.588094 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.592026 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.593101 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.593574 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.594707 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.595405 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.596602 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.597045 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.597066 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.598440 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.598624 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.599568 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.601685 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.620220 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.620395 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wjwhf"] Sep 30 05:31:09 crc 
kubenswrapper[4956]: I0930 05:31:09.620418 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.620972 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.621296 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.622826 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.623023 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.623903 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.625532 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zhkrt"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.625804 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mwj5h"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.626314 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.626331 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mwj5h" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.626804 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.628849 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.629532 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4fn5g"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.630516 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.631618 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.641842 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.642229 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.642441 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.642585 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.642772 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" 
Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.642878 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.643376 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5kh24"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.643662 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.635707 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.631699 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.644167 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.644280 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.644482 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.644679 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.645046 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.645492 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.635716 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.635804 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.635838 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.646129 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.635924 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.636016 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.646306 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.636044 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.636106 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 05:31:09 crc 
kubenswrapper[4956]: I0930 05:31:09.646489 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.637250 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.637723 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.637807 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.637828 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.638244 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.638402 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.638480 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.638521 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.638556 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.638671 4956 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.639670 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.640402 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.648895 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.648927 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.649070 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.649206 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.649514 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.649831 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.650206 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.650313 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 
05:31:09.650423 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.650458 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.650431 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.651100 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.651425 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.651793 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.652334 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.658157 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.658822 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.658933 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.662104 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.682813 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.683275 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xg6dj"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.692405 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.693108 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.693479 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.693773 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.694248 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-t5shm"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.694754 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.695660 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.700216 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9ee01037-07c8-471e-860f-801980e351f8-etcd-service-ca\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701216 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ec8ec3c-f0d3-41b1-a311-2eca015cd63a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wjwhf\" (UID: \"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701247 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ee01037-07c8-471e-860f-801980e351f8-config\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701259 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701264 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-config\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701404 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-audit\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701422 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wc4v\" (UniqueName: \"kubernetes.io/projected/2ad2e251-9549-4f18-8467-8dc98924bc23-kube-api-access-9wc4v\") pod \"openshift-config-operator-7777fb866f-s62b4\" (UID: \"2ad2e251-9549-4f18-8467-8dc98924bc23\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701440 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9ec8ec3c-f0d3-41b1-a311-2eca015cd63a-images\") pod \"machine-api-operator-5694c8668f-wjwhf\" (UID: \"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701455 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3c797d8-62e0-4ed4-b376-ac0aba58185e-auth-proxy-config\") pod \"machine-approver-56656f9798-442db\" (UID: \"d3c797d8-62e0-4ed4-b376-ac0aba58185e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:09 crc 
kubenswrapper[4956]: I0930 05:31:09.701471 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btlxj\" (UniqueName: \"kubernetes.io/projected/d3c797d8-62e0-4ed4-b376-ac0aba58185e-kube-api-access-btlxj\") pod \"machine-approver-56656f9798-442db\" (UID: \"d3c797d8-62e0-4ed4-b376-ac0aba58185e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701486 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8sd5\" (UniqueName: \"kubernetes.io/projected/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-kube-api-access-w8sd5\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701501 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebad915a-95b0-4596-8cbb-3d4acc6eb32a-serving-cert\") pod \"console-operator-58897d9998-ztg8w\" (UID: \"ebad915a-95b0-4596-8cbb-3d4acc6eb32a\") " pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701522 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d43f9c60-0360-4486-9c31-ecebf460d114-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701539 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-service-ca\") pod 
\"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701558 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f29793e9-ea42-4820-8c66-a8f861e0370a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gfgqv\" (UID: \"f29793e9-ea42-4820-8c66-a8f861e0370a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701504 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701576 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/050f9843-c228-4681-96e9-8649f7eff6fc-client-ca\") pod \"route-controller-manager-6576b87f9c-5khh2\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701619 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9ee01037-07c8-471e-860f-801980e351f8-etcd-ca\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701637 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-config\") pod \"apiserver-76f77b778f-mrbck\" (UID: 
\"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701652 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d43f9c60-0360-4486-9c31-ecebf460d114-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701669 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec8ec3c-f0d3-41b1-a311-2eca015cd63a-config\") pod \"machine-api-operator-5694c8668f-wjwhf\" (UID: \"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701687 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr2fm\" (UniqueName: \"kubernetes.io/projected/15065912-8d28-47dd-adaf-f7aedc754526-kube-api-access-kr2fm\") pod \"dns-operator-744455d44c-mwj5h\" (UID: \"15065912-8d28-47dd-adaf-f7aedc754526\") " pod="openshift-dns-operator/dns-operator-744455d44c-mwj5h" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701706 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90e0bca8-7e63-438d-95ca-7b855c885655-encryption-config\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701725 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/ebad915a-95b0-4596-8cbb-3d4acc6eb32a-trusted-ca\") pod \"console-operator-58897d9998-ztg8w\" (UID: \"ebad915a-95b0-4596-8cbb-3d4acc6eb32a\") " pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701745 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw4km\" (UniqueName: \"kubernetes.io/projected/90e0bca8-7e63-438d-95ca-7b855c885655-kube-api-access-bw4km\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701763 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d43f9c60-0360-4486-9c31-ecebf460d114-serving-cert\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701780 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050f9843-c228-4681-96e9-8649f7eff6fc-config\") pod \"route-controller-manager-6576b87f9c-5khh2\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701841 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee01037-07c8-471e-860f-801980e351f8-serving-cert\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701867 
4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9ee01037-07c8-471e-860f-801980e351f8-etcd-client\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701884 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-trusted-ca-bundle\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701946 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90e0bca8-7e63-438d-95ca-7b855c885655-audit-dir\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701967 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ad2e251-9549-4f18-8467-8dc98924bc23-serving-cert\") pod \"openshift-config-operator-7777fb866f-s62b4\" (UID: \"2ad2e251-9549-4f18-8467-8dc98924bc23\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.701987 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d43f9c60-0360-4486-9c31-ecebf460d114-audit-dir\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702013 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-oauth-serving-cert\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702037 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-oauth-config\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702055 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-config\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702094 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90e0bca8-7e63-438d-95ca-7b855c885655-serving-cert\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702144 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91889a1-69c9-48ae-a920-00b05adab7e5-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702162 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702181 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txmfc\" (UniqueName: \"kubernetes.io/projected/f91889a1-69c9-48ae-a920-00b05adab7e5-kube-api-access-txmfc\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702201 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d43f9c60-0360-4486-9c31-ecebf460d114-audit-policies\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702218 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3c797d8-62e0-4ed4-b376-ac0aba58185e-config\") pod \"machine-approver-56656f9798-442db\" (UID: \"d3c797d8-62e0-4ed4-b376-ac0aba58185e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702293 
4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f91889a1-69c9-48ae-a920-00b05adab7e5-serving-cert\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702323 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39bae241-73b1-4078-9861-309c762b38b5-serving-cert\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702341 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f91889a1-69c9-48ae-a920-00b05adab7e5-config\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702358 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/050f9843-c228-4681-96e9-8649f7eff6fc-serving-cert\") pod \"route-controller-manager-6576b87f9c-5khh2\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702381 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8vpq\" (UniqueName: \"kubernetes.io/projected/050f9843-c228-4681-96e9-8649f7eff6fc-kube-api-access-k8vpq\") 
pod \"route-controller-manager-6576b87f9c-5khh2\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702417 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15065912-8d28-47dd-adaf-f7aedc754526-metrics-tls\") pod \"dns-operator-744455d44c-mwj5h\" (UID: \"15065912-8d28-47dd-adaf-f7aedc754526\") " pod="openshift-dns-operator/dns-operator-744455d44c-mwj5h" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702434 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d43f9c60-0360-4486-9c31-ecebf460d114-etcd-client\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702490 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/90e0bca8-7e63-438d-95ca-7b855c885655-node-pullsecrets\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702530 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29793e9-ea42-4820-8c66-a8f861e0370a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gfgqv\" (UID: \"f29793e9-ea42-4820-8c66-a8f861e0370a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702548 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l997k\" (UniqueName: \"kubernetes.io/projected/39bae241-73b1-4078-9861-309c762b38b5-kube-api-access-l997k\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702564 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-client-ca\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702581 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-image-import-ca\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702605 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d3c797d8-62e0-4ed4-b376-ac0aba58185e-machine-approver-tls\") pod \"machine-approver-56656f9798-442db\" (UID: \"d3c797d8-62e0-4ed4-b376-ac0aba58185e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702621 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2ad2e251-9549-4f18-8467-8dc98924bc23-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s62b4\" 
(UID: \"2ad2e251-9549-4f18-8467-8dc98924bc23\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702637 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d43f9c60-0360-4486-9c31-ecebf460d114-encryption-config\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702662 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702670 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhj9x\" (UniqueName: \"kubernetes.io/projected/d43f9c60-0360-4486-9c31-ecebf460d114-kube-api-access-vhj9x\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702722 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-serving-cert\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702741 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-etcd-serving-ca\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " 
pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702755 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702793 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szmpz\" (UniqueName: \"kubernetes.io/projected/9ee01037-07c8-471e-860f-801980e351f8-kube-api-access-szmpz\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702810 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90e0bca8-7e63-438d-95ca-7b855c885655-etcd-client\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702827 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91889a1-69c9-48ae-a920-00b05adab7e5-service-ca-bundle\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702857 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x8bg\" (UniqueName: 
\"kubernetes.io/projected/9ec8ec3c-f0d3-41b1-a311-2eca015cd63a-kube-api-access-6x8bg\") pod \"machine-api-operator-5694c8668f-wjwhf\" (UID: \"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702874 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njg99\" (UniqueName: \"kubernetes.io/projected/f29793e9-ea42-4820-8c66-a8f861e0370a-kube-api-access-njg99\") pod \"openshift-controller-manager-operator-756b6f6bc6-gfgqv\" (UID: \"f29793e9-ea42-4820-8c66-a8f861e0370a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702891 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebad915a-95b0-4596-8cbb-3d4acc6eb32a-config\") pod \"console-operator-58897d9998-ztg8w\" (UID: \"ebad915a-95b0-4596-8cbb-3d4acc6eb32a\") " pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.702906 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h78br\" (UniqueName: \"kubernetes.io/projected/ebad915a-95b0-4596-8cbb-3d4acc6eb32a-kube-api-access-h78br\") pod \"console-operator-58897d9998-ztg8w\" (UID: \"ebad915a-95b0-4596-8cbb-3d4acc6eb32a\") " pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.703155 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-t5shm" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.705502 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.706162 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.706558 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.708392 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-44xzl"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.708576 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.708805 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.708954 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.709244 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.709378 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.711418 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.712598 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.713655 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5v6z6"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.714280 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5v6z6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.716073 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rvfx8"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.716582 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.717042 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.717693 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.718024 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rtd48"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.720810 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.721957 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.724594 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.725307 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.725312 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.725883 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.726384 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.730180 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zhkrt"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.733368 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.733981 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.736057 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.736660 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.738752 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-v25lf"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.739246 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-v25lf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.741352 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mrbck"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.745604 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.749715 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2kcxx"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.749758 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.749769 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s62b4"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.752659 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ztg8w"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.752691 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8658c"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.755543 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.755568 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.755629 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8658c" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.761701 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.763700 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-44xzl"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.765069 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.765270 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.766220 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wjwhf"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.767345 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9c8r"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.768441 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.769945 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.771157 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.772192 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-4fn5g"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.773293 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5kh24"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.774579 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5v6z6"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.775539 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mwj5h"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.780126 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-t5shm"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.782003 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.786614 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.786710 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.789461 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.791241 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.792208 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.793213 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.794165 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.796316 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.797571 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rvfx8"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.799148 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.800507 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.802028 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-69hd4"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.803240 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.803792 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebad915a-95b0-4596-8cbb-3d4acc6eb32a-config\") pod \"console-operator-58897d9998-ztg8w\" (UID: \"ebad915a-95b0-4596-8cbb-3d4acc6eb32a\") " pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.803874 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h78br\" (UniqueName: \"kubernetes.io/projected/ebad915a-95b0-4596-8cbb-3d4acc6eb32a-kube-api-access-h78br\") pod \"console-operator-58897d9998-ztg8w\" (UID: \"ebad915a-95b0-4596-8cbb-3d4acc6eb32a\") " pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.803905 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hk4n\" (UniqueName: \"kubernetes.io/projected/aa301454-b9a6-4bed-acd1-f2cf109b5259-kube-api-access-2hk4n\") pod \"control-plane-machine-set-operator-78cbb6b69f-pclsc\" (UID: \"aa301454-b9a6-4bed-acd1-f2cf109b5259\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.803945 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghzpt\" (UniqueName: \"kubernetes.io/projected/c0770b87-e3c3-49d5-a290-3a3cd64367a7-kube-api-access-ghzpt\") pod \"ingress-operator-5b745b69d9-5j47n\" (UID: \"c0770b87-e3c3-49d5-a290-3a3cd64367a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.803969 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0770b87-e3c3-49d5-a290-3a3cd64367a7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5j47n\" (UID: \"c0770b87-e3c3-49d5-a290-3a3cd64367a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.803987 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/983d58af-8688-438e-8f8c-c658d1a9bdac-srv-cert\") pod \"olm-operator-6b444d44fb-6mbsg\" (UID: \"983d58af-8688-438e-8f8c-c658d1a9bdac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804018 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9ee01037-07c8-471e-860f-801980e351f8-etcd-service-ca\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804042 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/140b2851-bf05-4ec3-87db-657aacefdbd4-secret-volume\") pod \"collect-profiles-29320170-h7hht\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804062 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psjnd\" (UniqueName: \"kubernetes.io/projected/bcb2fabd-b2e0-41f3-8677-e55602876ca9-kube-api-access-psjnd\") pod \"multus-admission-controller-857f4d67dd-t5shm\" (UID: \"bcb2fabd-b2e0-41f3-8677-e55602876ca9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-t5shm" 
Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804085 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa301454-b9a6-4bed-acd1-f2cf109b5259-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pclsc\" (UID: \"aa301454-b9a6-4bed-acd1-f2cf109b5259\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804124 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ec8ec3c-f0d3-41b1-a311-2eca015cd63a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wjwhf\" (UID: \"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804159 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ee01037-07c8-471e-860f-801980e351f8-config\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804180 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-config\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804202 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-audit\") pod 
\"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804224 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wc4v\" (UniqueName: \"kubernetes.io/projected/2ad2e251-9549-4f18-8467-8dc98924bc23-kube-api-access-9wc4v\") pod \"openshift-config-operator-7777fb866f-s62b4\" (UID: \"2ad2e251-9549-4f18-8467-8dc98924bc23\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804246 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8sd5\" (UniqueName: \"kubernetes.io/projected/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-kube-api-access-w8sd5\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804268 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebad915a-95b0-4596-8cbb-3d4acc6eb32a-serving-cert\") pod \"console-operator-58897d9998-ztg8w\" (UID: \"ebad915a-95b0-4596-8cbb-3d4acc6eb32a\") " pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804291 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9ec8ec3c-f0d3-41b1-a311-2eca015cd63a-images\") pod \"machine-api-operator-5694c8668f-wjwhf\" (UID: \"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804312 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/d3c797d8-62e0-4ed4-b376-ac0aba58185e-auth-proxy-config\") pod \"machine-approver-56656f9798-442db\" (UID: \"d3c797d8-62e0-4ed4-b376-ac0aba58185e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804332 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btlxj\" (UniqueName: \"kubernetes.io/projected/d3c797d8-62e0-4ed4-b376-ac0aba58185e-kube-api-access-btlxj\") pod \"machine-approver-56656f9798-442db\" (UID: \"d3c797d8-62e0-4ed4-b376-ac0aba58185e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804353 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d43f9c60-0360-4486-9c31-ecebf460d114-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804378 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj5zd\" (UniqueName: \"kubernetes.io/projected/983d58af-8688-438e-8f8c-c658d1a9bdac-kube-api-access-fj5zd\") pod \"olm-operator-6b444d44fb-6mbsg\" (UID: \"983d58af-8688-438e-8f8c-c658d1a9bdac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804404 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-service-ca\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804426 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f29793e9-ea42-4820-8c66-a8f861e0370a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gfgqv\" (UID: \"f29793e9-ea42-4820-8c66-a8f861e0370a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804485 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/050f9843-c228-4681-96e9-8649f7eff6fc-client-ca\") pod \"route-controller-manager-6576b87f9c-5khh2\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804511 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/140b2851-bf05-4ec3-87db-657aacefdbd4-config-volume\") pod \"collect-profiles-29320170-h7hht\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804542 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlb7v\" (UniqueName: \"kubernetes.io/projected/3a4863df-527f-4038-be9f-286437603a44-kube-api-access-vlb7v\") pod \"migrator-59844c95c7-5v6z6\" (UID: \"3a4863df-527f-4038-be9f-286437603a44\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5v6z6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804592 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9ee01037-07c8-471e-860f-801980e351f8-etcd-ca\") pod \"etcd-operator-b45778765-zhkrt\" (UID: 
\"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804614 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e8969ec-fae5-48f6-afe3-f230deb5f802-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-j9dnt\" (UID: \"1e8969ec-fae5-48f6-afe3-f230deb5f802\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804637 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec8ec3c-f0d3-41b1-a311-2eca015cd63a-config\") pod \"machine-api-operator-5694c8668f-wjwhf\" (UID: \"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804659 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-config\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804681 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d43f9c60-0360-4486-9c31-ecebf460d114-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804705 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ee7ae876-4694-4cca-92e9-80528dc98bb4-config\") pod \"kube-apiserver-operator-766d6c64bb-42782\" (UID: \"ee7ae876-4694-4cca-92e9-80528dc98bb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804731 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr2fm\" (UniqueName: \"kubernetes.io/projected/15065912-8d28-47dd-adaf-f7aedc754526-kube-api-access-kr2fm\") pod \"dns-operator-744455d44c-mwj5h\" (UID: \"15065912-8d28-47dd-adaf-f7aedc754526\") " pod="openshift-dns-operator/dns-operator-744455d44c-mwj5h" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804753 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90e0bca8-7e63-438d-95ca-7b855c885655-encryption-config\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804772 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebad915a-95b0-4596-8cbb-3d4acc6eb32a-trusted-ca\") pod \"console-operator-58897d9998-ztg8w\" (UID: \"ebad915a-95b0-4596-8cbb-3d4acc6eb32a\") " pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804792 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l87pb\" (UID: \"00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 
05:31:09.804816 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56a1ba28-4b12-49ea-a04d-a9ab766a187f-proxy-tls\") pod \"machine-config-operator-74547568cd-8rd8z\" (UID: \"56a1ba28-4b12-49ea-a04d-a9ab766a187f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804838 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rvfx8\" (UID: \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804861 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw4km\" (UniqueName: \"kubernetes.io/projected/90e0bca8-7e63-438d-95ca-7b855c885655-kube-api-access-bw4km\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804892 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d43f9c60-0360-4486-9c31-ecebf460d114-serving-cert\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804912 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050f9843-c228-4681-96e9-8649f7eff6fc-config\") pod \"route-controller-manager-6576b87f9c-5khh2\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804933 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-trusted-ca-bundle\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804953 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee01037-07c8-471e-860f-801980e351f8-serving-cert\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804973 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9ee01037-07c8-471e-860f-801980e351f8-etcd-client\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.804994 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d43f9c60-0360-4486-9c31-ecebf460d114-audit-dir\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805015 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90e0bca8-7e63-438d-95ca-7b855c885655-audit-dir\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " 
pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805035 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ad2e251-9549-4f18-8467-8dc98924bc23-serving-cert\") pod \"openshift-config-operator-7777fb866f-s62b4\" (UID: \"2ad2e251-9549-4f18-8467-8dc98924bc23\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805058 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-oauth-serving-cert\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805079 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee7ae876-4694-4cca-92e9-80528dc98bb4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-42782\" (UID: \"ee7ae876-4694-4cca-92e9-80528dc98bb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805102 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-oauth-config\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805140 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-config\") pod 
\"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805172 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90e0bca8-7e63-438d-95ca-7b855c885655-serving-cert\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805194 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rvfx8\" (UID: \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805231 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91889a1-69c9-48ae-a920-00b05adab7e5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805254 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805278 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-txmfc\" (UniqueName: \"kubernetes.io/projected/f91889a1-69c9-48ae-a920-00b05adab7e5-kube-api-access-txmfc\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805300 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee7ae876-4694-4cca-92e9-80528dc98bb4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-42782\" (UID: \"ee7ae876-4694-4cca-92e9-80528dc98bb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805321 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/983d58af-8688-438e-8f8c-c658d1a9bdac-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6mbsg\" (UID: \"983d58af-8688-438e-8f8c-c658d1a9bdac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805342 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3c797d8-62e0-4ed4-b376-ac0aba58185e-config\") pod \"machine-approver-56656f9798-442db\" (UID: \"d3c797d8-62e0-4ed4-b376-ac0aba58185e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805365 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d43f9c60-0360-4486-9c31-ecebf460d114-audit-policies\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805389 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2774c9ff-8b7d-4937-a6bb-86746bd4829f-signing-key\") pod \"service-ca-9c57cc56f-44xzl\" (UID: \"2774c9ff-8b7d-4937-a6bb-86746bd4829f\") " pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805416 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f91889a1-69c9-48ae-a920-00b05adab7e5-serving-cert\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805465 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85jdx\" (UniqueName: \"kubernetes.io/projected/56a1ba28-4b12-49ea-a04d-a9ab766a187f-kube-api-access-85jdx\") pod \"machine-config-operator-74547568cd-8rd8z\" (UID: \"56a1ba28-4b12-49ea-a04d-a9ab766a187f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.806287 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec8ec3c-f0d3-41b1-a311-2eca015cd63a-config\") pod \"machine-api-operator-5694c8668f-wjwhf\" (UID: \"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.806586 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d43f9c60-0360-4486-9c31-ecebf460d114-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.807043 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-service-ca\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.807244 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-trusted-ca-bundle\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.808052 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d43f9c60-0360-4486-9c31-ecebf460d114-audit-dir\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.808139 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90e0bca8-7e63-438d-95ca-7b855c885655-audit-dir\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.808474 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3c797d8-62e0-4ed4-b376-ac0aba58185e-config\") pod 
\"machine-approver-56656f9798-442db\" (UID: \"d3c797d8-62e0-4ed4-b376-ac0aba58185e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.808699 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91889a1-69c9-48ae-a920-00b05adab7e5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.808726 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ee01037-07c8-471e-860f-801980e351f8-config\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.808879 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.808905 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/050f9843-c228-4681-96e9-8649f7eff6fc-client-ca\") pod \"route-controller-manager-6576b87f9c-5khh2\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.808911 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9ec8ec3c-f0d3-41b1-a311-2eca015cd63a-images\") pod \"machine-api-operator-5694c8668f-wjwhf\" (UID: \"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 
05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.809429 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9ee01037-07c8-471e-860f-801980e351f8-etcd-ca\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.805382 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-l45c8"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.809874 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9ee01037-07c8-471e-860f-801980e351f8-etcd-service-ca\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.810207 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.810231 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-v25lf"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.810301 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l45c8" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.810467 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3c797d8-62e0-4ed4-b376-ac0aba58185e-auth-proxy-config\") pod \"machine-approver-56656f9798-442db\" (UID: \"d3c797d8-62e0-4ed4-b376-ac0aba58185e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.810935 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7-config\") pod \"kube-controller-manager-operator-78b949d7b-l87pb\" (UID: \"00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.810980 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39bae241-73b1-4078-9861-309c762b38b5-serving-cert\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811001 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f91889a1-69c9-48ae-a920-00b05adab7e5-config\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811021 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/050f9843-c228-4681-96e9-8649f7eff6fc-serving-cert\") pod \"route-controller-manager-6576b87f9c-5khh2\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811038 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8vpq\" (UniqueName: \"kubernetes.io/projected/050f9843-c228-4681-96e9-8649f7eff6fc-kube-api-access-k8vpq\") pod \"route-controller-manager-6576b87f9c-5khh2\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811062 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36e17eaf-9884-4aa0-bd1e-0768c2e592f9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zhmb6\" (UID: \"36e17eaf-9884-4aa0-bd1e-0768c2e592f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811087 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wdsn\" (UniqueName: \"kubernetes.io/projected/2774c9ff-8b7d-4937-a6bb-86746bd4829f-kube-api-access-9wdsn\") pod \"service-ca-9c57cc56f-44xzl\" (UID: \"2774c9ff-8b7d-4937-a6bb-86746bd4829f\") " pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811105 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36e17eaf-9884-4aa0-bd1e-0768c2e592f9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zhmb6\" (UID: \"36e17eaf-9884-4aa0-bd1e-0768c2e592f9\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811141 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2774c9ff-8b7d-4937-a6bb-86746bd4829f-signing-cabundle\") pod \"service-ca-9c57cc56f-44xzl\" (UID: \"2774c9ff-8b7d-4937-a6bb-86746bd4829f\") " pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811165 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15065912-8d28-47dd-adaf-f7aedc754526-metrics-tls\") pod \"dns-operator-744455d44c-mwj5h\" (UID: \"15065912-8d28-47dd-adaf-f7aedc754526\") " pod="openshift-dns-operator/dns-operator-744455d44c-mwj5h" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811190 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811251 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d43f9c60-0360-4486-9c31-ecebf460d114-etcd-client\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811270 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0770b87-e3c3-49d5-a290-3a3cd64367a7-trusted-ca\") pod \"ingress-operator-5b745b69d9-5j47n\" (UID: \"c0770b87-e3c3-49d5-a290-3a3cd64367a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811289 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/90e0bca8-7e63-438d-95ca-7b855c885655-node-pullsecrets\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811305 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/56a1ba28-4b12-49ea-a04d-a9ab766a187f-images\") pod \"machine-config-operator-74547568cd-8rd8z\" (UID: \"56a1ba28-4b12-49ea-a04d-a9ab766a187f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811321 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106d5b0b-b745-4129-b766-ea2ab60c3569-config\") pod \"service-ca-operator-777779d784-6hhrb\" (UID: \"106d5b0b-b745-4129-b766-ea2ab60c3569\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811339 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxzkk\" (UniqueName: \"kubernetes.io/projected/106d5b0b-b745-4129-b766-ea2ab60c3569-kube-api-access-qxzkk\") pod \"service-ca-operator-777779d784-6hhrb\" (UID: \"106d5b0b-b745-4129-b766-ea2ab60c3569\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811365 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29793e9-ea42-4820-8c66-a8f861e0370a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gfgqv\" (UID: 
\"f29793e9-ea42-4820-8c66-a8f861e0370a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811385 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l997k\" (UniqueName: \"kubernetes.io/projected/39bae241-73b1-4078-9861-309c762b38b5-kube-api-access-l997k\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811404 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-client-ca\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811420 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-image-import-ca\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811439 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjt74\" (UniqueName: \"kubernetes.io/projected/1e8969ec-fae5-48f6-afe3-f230deb5f802-kube-api-access-mjt74\") pod \"package-server-manager-789f6589d5-j9dnt\" (UID: \"1e8969ec-fae5-48f6-afe3-f230deb5f802\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811571 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/9ee01037-07c8-471e-860f-801980e351f8-etcd-client\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.811786 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050f9843-c228-4681-96e9-8649f7eff6fc-config\") pod \"route-controller-manager-6576b87f9c-5khh2\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.812272 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l45c8"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.812587 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-audit\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.812806 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-config\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.813048 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d43f9c60-0360-4486-9c31-ecebf460d114-audit-policies\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" 
Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.813304 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-image-import-ca\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.813446 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f91889a1-69c9-48ae-a920-00b05adab7e5-config\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.813537 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebad915a-95b0-4596-8cbb-3d4acc6eb32a-serving-cert\") pod \"console-operator-58897d9998-ztg8w\" (UID: \"ebad915a-95b0-4596-8cbb-3d4acc6eb32a\") " pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.813795 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90e0bca8-7e63-438d-95ca-7b855c885655-encryption-config\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.813862 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29793e9-ea42-4820-8c66-a8f861e0370a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gfgqv\" (UID: \"f29793e9-ea42-4820-8c66-a8f861e0370a\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.813989 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ec8ec3c-f0d3-41b1-a311-2eca015cd63a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wjwhf\" (UID: \"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.814331 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f91889a1-69c9-48ae-a920-00b05adab7e5-serving-cert\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.814418 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-config\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.814495 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtlz5\" (UniqueName: \"kubernetes.io/projected/140b2851-bf05-4ec3-87db-657aacefdbd4-kube-api-access-jtlz5\") pod \"collect-profiles-29320170-h7hht\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.814509 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-69hd4"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 
05:31:09.814545 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d3c797d8-62e0-4ed4-b376-ac0aba58185e-machine-approver-tls\") pod \"machine-approver-56656f9798-442db\" (UID: \"d3c797d8-62e0-4ed4-b376-ac0aba58185e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.814658 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f29793e9-ea42-4820-8c66-a8f861e0370a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gfgqv\" (UID: \"f29793e9-ea42-4820-8c66-a8f861e0370a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.814662 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-client-ca\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.814732 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/90e0bca8-7e63-438d-95ca-7b855c885655-node-pullsecrets\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.814751 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2ad2e251-9549-4f18-8467-8dc98924bc23-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s62b4\" (UID: 
\"2ad2e251-9549-4f18-8467-8dc98924bc23\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.814791 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d43f9c60-0360-4486-9c31-ecebf460d114-encryption-config\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.814866 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56a1ba28-4b12-49ea-a04d-a9ab766a187f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8rd8z\" (UID: \"56a1ba28-4b12-49ea-a04d-a9ab766a187f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.814956 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebad915a-95b0-4596-8cbb-3d4acc6eb32a-trusted-ca\") pod \"console-operator-58897d9998-ztg8w\" (UID: \"ebad915a-95b0-4596-8cbb-3d4acc6eb32a\") " pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.814956 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d43f9c60-0360-4486-9c31-ecebf460d114-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.815009 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.815030 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d43f9c60-0360-4486-9c31-ecebf460d114-etcd-client\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.815084 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhj9x\" (UniqueName: \"kubernetes.io/projected/d43f9c60-0360-4486-9c31-ecebf460d114-kube-api-access-vhj9x\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.815140 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/106d5b0b-b745-4129-b766-ea2ab60c3569-serving-cert\") pod \"service-ca-operator-777779d784-6hhrb\" (UID: \"106d5b0b-b745-4129-b766-ea2ab60c3569\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.815251 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-serving-cert\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.815295 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-etcd-serving-ca\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.815293 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2ad2e251-9549-4f18-8467-8dc98924bc23-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s62b4\" (UID: \"2ad2e251-9549-4f18-8467-8dc98924bc23\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.815374 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.815626 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ad2e251-9549-4f18-8467-8dc98924bc23-serving-cert\") pod \"openshift-config-operator-7777fb866f-s62b4\" (UID: \"2ad2e251-9549-4f18-8467-8dc98924bc23\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.815821 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15065912-8d28-47dd-adaf-f7aedc754526-metrics-tls\") pod \"dns-operator-744455d44c-mwj5h\" (UID: \"15065912-8d28-47dd-adaf-f7aedc754526\") " pod="openshift-dns-operator/dns-operator-744455d44c-mwj5h" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 
05:31:09.815885 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-etcd-serving-ca\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.815885 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l87pb\" (UID: \"00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816023 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d43f9c60-0360-4486-9c31-ecebf460d114-serving-cert\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816048 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0770b87-e3c3-49d5-a290-3a3cd64367a7-metrics-tls\") pod \"ingress-operator-5b745b69d9-5j47n\" (UID: \"c0770b87-e3c3-49d5-a290-3a3cd64367a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816062 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39bae241-73b1-4078-9861-309c762b38b5-serving-cert\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816097 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91889a1-69c9-48ae-a920-00b05adab7e5-service-ca-bundle\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816151 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8r2d\" (UniqueName: \"kubernetes.io/projected/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-kube-api-access-z8r2d\") pod \"marketplace-operator-79b997595-rvfx8\" (UID: \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816265 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szmpz\" (UniqueName: \"kubernetes.io/projected/9ee01037-07c8-471e-860f-801980e351f8-kube-api-access-szmpz\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816304 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90e0bca8-7e63-438d-95ca-7b855c885655-etcd-client\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816635 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-oauth-serving-cert\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816668 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-config\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816732 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e0bca8-7e63-438d-95ca-7b855c885655-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816737 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x8bg\" (UniqueName: \"kubernetes.io/projected/9ec8ec3c-f0d3-41b1-a311-2eca015cd63a-kube-api-access-6x8bg\") pod \"machine-api-operator-5694c8668f-wjwhf\" (UID: \"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816849 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njg99\" (UniqueName: \"kubernetes.io/projected/f29793e9-ea42-4820-8c66-a8f861e0370a-kube-api-access-njg99\") pod \"openshift-controller-manager-operator-756b6f6bc6-gfgqv\" (UID: \"f29793e9-ea42-4820-8c66-a8f861e0370a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816915 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcb2fabd-b2e0-41f3-8677-e55602876ca9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-t5shm\" (UID: \"bcb2fabd-b2e0-41f3-8677-e55602876ca9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-t5shm" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.816941 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqwr5\" (UniqueName: \"kubernetes.io/projected/36e17eaf-9884-4aa0-bd1e-0768c2e592f9-kube-api-access-tqwr5\") pod \"openshift-apiserver-operator-796bbdcf4f-zhmb6\" (UID: \"36e17eaf-9884-4aa0-bd1e-0768c2e592f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.817325 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebad915a-95b0-4596-8cbb-3d4acc6eb32a-config\") pod \"console-operator-58897d9998-ztg8w\" (UID: \"ebad915a-95b0-4596-8cbb-3d4acc6eb32a\") " pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.817371 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w22x5"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.817617 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee01037-07c8-471e-860f-801980e351f8-serving-cert\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.817970 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/d3c797d8-62e0-4ed4-b376-ac0aba58185e-machine-approver-tls\") pod \"machine-approver-56656f9798-442db\" (UID: \"d3c797d8-62e0-4ed4-b376-ac0aba58185e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.818151 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90e0bca8-7e63-438d-95ca-7b855c885655-serving-cert\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.818207 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.818335 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w22x5"] Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.818448 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-oauth-config\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.819036 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90e0bca8-7e63-438d-95ca-7b855c885655-etcd-client\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.819788 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d43f9c60-0360-4486-9c31-ecebf460d114-encryption-config\") pod 
\"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.820342 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-serving-cert\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.826493 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.828620 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/050f9843-c228-4681-96e9-8649f7eff6fc-serving-cert\") pod \"route-controller-manager-6576b87f9c-5khh2\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.836774 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91889a1-69c9-48ae-a920-00b05adab7e5-service-ca-bundle\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.845456 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.865538 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.891012 
4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.905333 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917649 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l87pb\" (UID: \"00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917675 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0770b87-e3c3-49d5-a290-3a3cd64367a7-metrics-tls\") pod \"ingress-operator-5b745b69d9-5j47n\" (UID: \"c0770b87-e3c3-49d5-a290-3a3cd64367a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917695 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/106d5b0b-b745-4129-b766-ea2ab60c3569-serving-cert\") pod \"service-ca-operator-777779d784-6hhrb\" (UID: \"106d5b0b-b745-4129-b766-ea2ab60c3569\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917722 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8r2d\" (UniqueName: \"kubernetes.io/projected/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-kube-api-access-z8r2d\") pod \"marketplace-operator-79b997595-rvfx8\" (UID: \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917756 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcb2fabd-b2e0-41f3-8677-e55602876ca9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-t5shm\" (UID: \"bcb2fabd-b2e0-41f3-8677-e55602876ca9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-t5shm" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917775 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqwr5\" (UniqueName: \"kubernetes.io/projected/36e17eaf-9884-4aa0-bd1e-0768c2e592f9-kube-api-access-tqwr5\") pod \"openshift-apiserver-operator-796bbdcf4f-zhmb6\" (UID: \"36e17eaf-9884-4aa0-bd1e-0768c2e592f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917799 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hk4n\" (UniqueName: \"kubernetes.io/projected/aa301454-b9a6-4bed-acd1-f2cf109b5259-kube-api-access-2hk4n\") pod \"control-plane-machine-set-operator-78cbb6b69f-pclsc\" (UID: \"aa301454-b9a6-4bed-acd1-f2cf109b5259\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917823 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghzpt\" (UniqueName: \"kubernetes.io/projected/c0770b87-e3c3-49d5-a290-3a3cd64367a7-kube-api-access-ghzpt\") pod \"ingress-operator-5b745b69d9-5j47n\" (UID: \"c0770b87-e3c3-49d5-a290-3a3cd64367a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917845 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/c0770b87-e3c3-49d5-a290-3a3cd64367a7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5j47n\" (UID: \"c0770b87-e3c3-49d5-a290-3a3cd64367a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917898 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/140b2851-bf05-4ec3-87db-657aacefdbd4-secret-volume\") pod \"collect-profiles-29320170-h7hht\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917914 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/983d58af-8688-438e-8f8c-c658d1a9bdac-srv-cert\") pod \"olm-operator-6b444d44fb-6mbsg\" (UID: \"983d58af-8688-438e-8f8c-c658d1a9bdac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917935 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psjnd\" (UniqueName: \"kubernetes.io/projected/bcb2fabd-b2e0-41f3-8677-e55602876ca9-kube-api-access-psjnd\") pod \"multus-admission-controller-857f4d67dd-t5shm\" (UID: \"bcb2fabd-b2e0-41f3-8677-e55602876ca9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-t5shm" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.917983 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa301454-b9a6-4bed-acd1-f2cf109b5259-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pclsc\" (UID: \"aa301454-b9a6-4bed-acd1-f2cf109b5259\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918015 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj5zd\" (UniqueName: \"kubernetes.io/projected/983d58af-8688-438e-8f8c-c658d1a9bdac-kube-api-access-fj5zd\") pod \"olm-operator-6b444d44fb-6mbsg\" (UID: \"983d58af-8688-438e-8f8c-c658d1a9bdac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918031 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/140b2851-bf05-4ec3-87db-657aacefdbd4-config-volume\") pod \"collect-profiles-29320170-h7hht\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918048 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e8969ec-fae5-48f6-afe3-f230deb5f802-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-j9dnt\" (UID: \"1e8969ec-fae5-48f6-afe3-f230deb5f802\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918066 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlb7v\" (UniqueName: \"kubernetes.io/projected/3a4863df-527f-4038-be9f-286437603a44-kube-api-access-vlb7v\") pod \"migrator-59844c95c7-5v6z6\" (UID: \"3a4863df-527f-4038-be9f-286437603a44\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5v6z6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918081 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ee7ae876-4694-4cca-92e9-80528dc98bb4-config\") pod \"kube-apiserver-operator-766d6c64bb-42782\" (UID: \"ee7ae876-4694-4cca-92e9-80528dc98bb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918099 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l87pb\" (UID: \"00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918134 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56a1ba28-4b12-49ea-a04d-a9ab766a187f-proxy-tls\") pod \"machine-config-operator-74547568cd-8rd8z\" (UID: \"56a1ba28-4b12-49ea-a04d-a9ab766a187f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918149 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rvfx8\" (UID: \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918251 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee7ae876-4694-4cca-92e9-80528dc98bb4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-42782\" (UID: \"ee7ae876-4694-4cca-92e9-80528dc98bb4\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918267 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rvfx8\" (UID: \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918293 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee7ae876-4694-4cca-92e9-80528dc98bb4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-42782\" (UID: \"ee7ae876-4694-4cca-92e9-80528dc98bb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918308 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/983d58af-8688-438e-8f8c-c658d1a9bdac-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6mbsg\" (UID: \"983d58af-8688-438e-8f8c-c658d1a9bdac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918325 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2774c9ff-8b7d-4937-a6bb-86746bd4829f-signing-key\") pod \"service-ca-9c57cc56f-44xzl\" (UID: \"2774c9ff-8b7d-4937-a6bb-86746bd4829f\") " pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918347 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85jdx\" (UniqueName: 
\"kubernetes.io/projected/56a1ba28-4b12-49ea-a04d-a9ab766a187f-kube-api-access-85jdx\") pod \"machine-config-operator-74547568cd-8rd8z\" (UID: \"56a1ba28-4b12-49ea-a04d-a9ab766a187f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918362 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7-config\") pod \"kube-controller-manager-operator-78b949d7b-l87pb\" (UID: \"00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918376 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36e17eaf-9884-4aa0-bd1e-0768c2e592f9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zhmb6\" (UID: \"36e17eaf-9884-4aa0-bd1e-0768c2e592f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918391 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wdsn\" (UniqueName: \"kubernetes.io/projected/2774c9ff-8b7d-4937-a6bb-86746bd4829f-kube-api-access-9wdsn\") pod \"service-ca-9c57cc56f-44xzl\" (UID: \"2774c9ff-8b7d-4937-a6bb-86746bd4829f\") " pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918406 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0770b87-e3c3-49d5-a290-3a3cd64367a7-trusted-ca\") pod \"ingress-operator-5b745b69d9-5j47n\" (UID: \"c0770b87-e3c3-49d5-a290-3a3cd64367a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:09 crc kubenswrapper[4956]: 
I0930 05:31:09.918420 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36e17eaf-9884-4aa0-bd1e-0768c2e592f9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zhmb6\" (UID: \"36e17eaf-9884-4aa0-bd1e-0768c2e592f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918436 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2774c9ff-8b7d-4937-a6bb-86746bd4829f-signing-cabundle\") pod \"service-ca-9c57cc56f-44xzl\" (UID: \"2774c9ff-8b7d-4937-a6bb-86746bd4829f\") " pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918451 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/56a1ba28-4b12-49ea-a04d-a9ab766a187f-images\") pod \"machine-config-operator-74547568cd-8rd8z\" (UID: \"56a1ba28-4b12-49ea-a04d-a9ab766a187f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918479 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106d5b0b-b745-4129-b766-ea2ab60c3569-config\") pod \"service-ca-operator-777779d784-6hhrb\" (UID: \"106d5b0b-b745-4129-b766-ea2ab60c3569\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918494 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxzkk\" (UniqueName: \"kubernetes.io/projected/106d5b0b-b745-4129-b766-ea2ab60c3569-kube-api-access-qxzkk\") pod \"service-ca-operator-777779d784-6hhrb\" (UID: \"106d5b0b-b745-4129-b766-ea2ab60c3569\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918511 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjt74\" (UniqueName: \"kubernetes.io/projected/1e8969ec-fae5-48f6-afe3-f230deb5f802-kube-api-access-mjt74\") pod \"package-server-manager-789f6589d5-j9dnt\" (UID: \"1e8969ec-fae5-48f6-afe3-f230deb5f802\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918529 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtlz5\" (UniqueName: \"kubernetes.io/projected/140b2851-bf05-4ec3-87db-657aacefdbd4-kube-api-access-jtlz5\") pod \"collect-profiles-29320170-h7hht\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.918550 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56a1ba28-4b12-49ea-a04d-a9ab766a187f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8rd8z\" (UID: \"56a1ba28-4b12-49ea-a04d-a9ab766a187f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.919143 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56a1ba28-4b12-49ea-a04d-a9ab766a187f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8rd8z\" (UID: \"56a1ba28-4b12-49ea-a04d-a9ab766a187f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.919321 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/36e17eaf-9884-4aa0-bd1e-0768c2e592f9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zhmb6\" (UID: \"36e17eaf-9884-4aa0-bd1e-0768c2e592f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.919986 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7-config\") pod \"kube-controller-manager-operator-78b949d7b-l87pb\" (UID: \"00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.921832 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l87pb\" (UID: \"00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.921855 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36e17eaf-9884-4aa0-bd1e-0768c2e592f9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zhmb6\" (UID: \"36e17eaf-9884-4aa0-bd1e-0768c2e592f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.946919 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.967458 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 30 05:31:09 crc 
kubenswrapper[4956]: I0930 05:31:09.980352 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee7ae876-4694-4cca-92e9-80528dc98bb4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-42782\" (UID: \"ee7ae876-4694-4cca-92e9-80528dc98bb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" Sep 30 05:31:09 crc kubenswrapper[4956]: I0930 05:31:09.985529 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.005797 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.009482 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee7ae876-4694-4cca-92e9-80528dc98bb4-config\") pod \"kube-apiserver-operator-766d6c64bb-42782\" (UID: \"ee7ae876-4694-4cca-92e9-80528dc98bb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.025382 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.045391 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.066146 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.086193 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.105892 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.126331 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.146727 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.166336 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.185314 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.206192 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.226016 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.245508 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.266765 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.286021 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" 
Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.306084 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.325503 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.346903 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.373474 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.381253 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0770b87-e3c3-49d5-a290-3a3cd64367a7-trusted-ca\") pod \"ingress-operator-5b745b69d9-5j47n\" (UID: \"c0770b87-e3c3-49d5-a290-3a3cd64367a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.386175 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.405700 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.411025 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0770b87-e3c3-49d5-a290-3a3cd64367a7-metrics-tls\") pod \"ingress-operator-5b745b69d9-5j47n\" (UID: \"c0770b87-e3c3-49d5-a290-3a3cd64367a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.426237 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.445980 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.465943 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.471979 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcb2fabd-b2e0-41f3-8677-e55602876ca9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-t5shm\" (UID: \"bcb2fabd-b2e0-41f3-8677-e55602876ca9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-t5shm" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.486262 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.506103 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.527148 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.546029 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.565875 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.572377 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/140b2851-bf05-4ec3-87db-657aacefdbd4-secret-volume\") pod \"collect-profiles-29320170-h7hht\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.575555 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/983d58af-8688-438e-8f8c-c658d1a9bdac-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6mbsg\" (UID: \"983d58af-8688-438e-8f8c-c658d1a9bdac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.586549 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.606553 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.613806 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e8969ec-fae5-48f6-afe3-f230deb5f802-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-j9dnt\" (UID: \"1e8969ec-fae5-48f6-afe3-f230deb5f802\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.626345 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.646727 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.653857 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2774c9ff-8b7d-4937-a6bb-86746bd4829f-signing-key\") pod \"service-ca-9c57cc56f-44xzl\" (UID: \"2774c9ff-8b7d-4937-a6bb-86746bd4829f\") " pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.665684 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.670428 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2774c9ff-8b7d-4937-a6bb-86746bd4829f-signing-cabundle\") pod \"service-ca-9c57cc56f-44xzl\" (UID: \"2774c9ff-8b7d-4937-a6bb-86746bd4829f\") " pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.686522 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.706109 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.724276 4956 request.go:700] Waited for 1.01455354s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.725439 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.746075 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 
05:31:10.750629 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/56a1ba28-4b12-49ea-a04d-a9ab766a187f-images\") pod \"machine-config-operator-74547568cd-8rd8z\" (UID: \"56a1ba28-4b12-49ea-a04d-a9ab766a187f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.784407 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.785557 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.793306 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56a1ba28-4b12-49ea-a04d-a9ab766a187f-proxy-tls\") pod \"machine-config-operator-74547568cd-8rd8z\" (UID: \"56a1ba28-4b12-49ea-a04d-a9ab766a187f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.806350 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.827201 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.845774 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.865672 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 30 05:31:10 
crc kubenswrapper[4956]: I0930 05:31:10.893182 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.901915 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rvfx8\" (UID: \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.905449 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.912583 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rvfx8\" (UID: \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:10 crc kubenswrapper[4956]: E0930 05:31:10.919192 4956 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Sep 30 05:31:10 crc kubenswrapper[4956]: E0930 05:31:10.919306 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/140b2851-bf05-4ec3-87db-657aacefdbd4-config-volume podName:140b2851-bf05-4ec3-87db-657aacefdbd4 nodeName:}" failed. No retries permitted until 2025-09-30 05:31:11.419275646 +0000 UTC m=+141.746396241 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/140b2851-bf05-4ec3-87db-657aacefdbd4-config-volume") pod "collect-profiles-29320170-h7hht" (UID: "140b2851-bf05-4ec3-87db-657aacefdbd4") : failed to sync configmap cache: timed out waiting for the condition Sep 30 05:31:10 crc kubenswrapper[4956]: E0930 05:31:10.919623 4956 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Sep 30 05:31:10 crc kubenswrapper[4956]: E0930 05:31:10.919842 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/106d5b0b-b745-4129-b766-ea2ab60c3569-config podName:106d5b0b-b745-4129-b766-ea2ab60c3569 nodeName:}" failed. No retries permitted until 2025-09-30 05:31:11.419812865 +0000 UTC m=+141.746933500 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/106d5b0b-b745-4129-b766-ea2ab60c3569-config") pod "service-ca-operator-777779d784-6hhrb" (UID: "106d5b0b-b745-4129-b766-ea2ab60c3569") : failed to sync configmap cache: timed out waiting for the condition Sep 30 05:31:10 crc kubenswrapper[4956]: E0930 05:31:10.919635 4956 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Sep 30 05:31:10 crc kubenswrapper[4956]: E0930 05:31:10.919663 4956 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Sep 30 05:31:10 crc kubenswrapper[4956]: E0930 05:31:10.919671 4956 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Sep 30 05:31:10 crc kubenswrapper[4956]: E0930 05:31:10.920136 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/983d58af-8688-438e-8f8c-c658d1a9bdac-srv-cert podName:983d58af-8688-438e-8f8c-c658d1a9bdac nodeName:}" failed. No retries permitted until 2025-09-30 05:31:11.420099805 +0000 UTC m=+141.747220450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/983d58af-8688-438e-8f8c-c658d1a9bdac-srv-cert") pod "olm-operator-6b444d44fb-6mbsg" (UID: "983d58af-8688-438e-8f8c-c658d1a9bdac") : failed to sync secret cache: timed out waiting for the condition Sep 30 05:31:10 crc kubenswrapper[4956]: E0930 05:31:10.920276 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/106d5b0b-b745-4129-b766-ea2ab60c3569-serving-cert podName:106d5b0b-b745-4129-b766-ea2ab60c3569 nodeName:}" failed. No retries permitted until 2025-09-30 05:31:11.42025179 +0000 UTC m=+141.747372365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/106d5b0b-b745-4129-b766-ea2ab60c3569-serving-cert") pod "service-ca-operator-777779d784-6hhrb" (UID: "106d5b0b-b745-4129-b766-ea2ab60c3569") : failed to sync secret cache: timed out waiting for the condition Sep 30 05:31:10 crc kubenswrapper[4956]: E0930 05:31:10.920306 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa301454-b9a6-4bed-acd1-f2cf109b5259-control-plane-machine-set-operator-tls podName:aa301454-b9a6-4bed-acd1-f2cf109b5259 nodeName:}" failed. No retries permitted until 2025-09-30 05:31:11.420291181 +0000 UTC m=+141.747411746 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/aa301454-b9a6-4bed-acd1-f2cf109b5259-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-pclsc" (UID: "aa301454-b9a6-4bed-acd1-f2cf109b5259") : failed to sync secret cache: timed out waiting for the condition Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.925847 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.945887 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.965806 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 05:31:10 crc kubenswrapper[4956]: I0930 05:31:10.992724 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.006021 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.025110 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.046755 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.065980 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.086159 4956 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.105993 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.126645 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.146215 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.185679 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.205409 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.226509 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.246102 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.266032 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.286279 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.306209 4956 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.326549 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.347132 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.367370 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.386797 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.406649 4956 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.426597 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.444861 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106d5b0b-b745-4129-b766-ea2ab60c3569-config\") pod \"service-ca-operator-777779d784-6hhrb\" (UID: \"106d5b0b-b745-4129-b766-ea2ab60c3569\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.444956 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/106d5b0b-b745-4129-b766-ea2ab60c3569-serving-cert\") pod \"service-ca-operator-777779d784-6hhrb\" (UID: 
\"106d5b0b-b745-4129-b766-ea2ab60c3569\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.445073 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/983d58af-8688-438e-8f8c-c658d1a9bdac-srv-cert\") pod \"olm-operator-6b444d44fb-6mbsg\" (UID: \"983d58af-8688-438e-8f8c-c658d1a9bdac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.445150 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa301454-b9a6-4bed-acd1-f2cf109b5259-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pclsc\" (UID: \"aa301454-b9a6-4bed-acd1-f2cf109b5259\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.445212 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/140b2851-bf05-4ec3-87db-657aacefdbd4-config-volume\") pod \"collect-profiles-29320170-h7hht\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.445543 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.445690 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106d5b0b-b745-4129-b766-ea2ab60c3569-config\") pod \"service-ca-operator-777779d784-6hhrb\" (UID: \"106d5b0b-b745-4129-b766-ea2ab60c3569\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.447887 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/140b2851-bf05-4ec3-87db-657aacefdbd4-config-volume\") pod \"collect-profiles-29320170-h7hht\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.448633 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/983d58af-8688-438e-8f8c-c658d1a9bdac-srv-cert\") pod \"olm-operator-6b444d44fb-6mbsg\" (UID: \"983d58af-8688-438e-8f8c-c658d1a9bdac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.448703 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa301454-b9a6-4bed-acd1-f2cf109b5259-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pclsc\" (UID: \"aa301454-b9a6-4bed-acd1-f2cf109b5259\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.448783 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/106d5b0b-b745-4129-b766-ea2ab60c3569-serving-cert\") pod \"service-ca-operator-777779d784-6hhrb\" (UID: \"106d5b0b-b745-4129-b766-ea2ab60c3569\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.480178 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wc4v\" (UniqueName: 
\"kubernetes.io/projected/2ad2e251-9549-4f18-8467-8dc98924bc23-kube-api-access-9wc4v\") pod \"openshift-config-operator-7777fb866f-s62b4\" (UID: \"2ad2e251-9549-4f18-8467-8dc98924bc23\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.502072 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw4km\" (UniqueName: \"kubernetes.io/projected/90e0bca8-7e63-438d-95ca-7b855c885655-kube-api-access-bw4km\") pod \"apiserver-76f77b778f-mrbck\" (UID: \"90e0bca8-7e63-438d-95ca-7b855c885655\") " pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.526663 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.530726 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txmfc\" (UniqueName: \"kubernetes.io/projected/f91889a1-69c9-48ae-a920-00b05adab7e5-kube-api-access-txmfc\") pod \"authentication-operator-69f744f599-2kcxx\" (UID: \"f91889a1-69c9-48ae-a920-00b05adab7e5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.546497 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.586222 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.591823 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btlxj\" (UniqueName: \"kubernetes.io/projected/d3c797d8-62e0-4ed4-b376-ac0aba58185e-kube-api-access-btlxj\") pod \"machine-approver-56656f9798-442db\" (UID: 
\"d3c797d8-62e0-4ed4-b376-ac0aba58185e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.606345 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.641083 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h78br\" (UniqueName: \"kubernetes.io/projected/ebad915a-95b0-4596-8cbb-3d4acc6eb32a-kube-api-access-h78br\") pod \"console-operator-58897d9998-ztg8w\" (UID: \"ebad915a-95b0-4596-8cbb-3d4acc6eb32a\") " pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.661745 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr2fm\" (UniqueName: \"kubernetes.io/projected/15065912-8d28-47dd-adaf-f7aedc754526-kube-api-access-kr2fm\") pod \"dns-operator-744455d44c-mwj5h\" (UID: \"15065912-8d28-47dd-adaf-f7aedc754526\") " pod="openshift-dns-operator/dns-operator-744455d44c-mwj5h" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.667863 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.677083 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.679293 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8vpq\" (UniqueName: \"kubernetes.io/projected/050f9843-c228-4681-96e9-8649f7eff6fc-kube-api-access-k8vpq\") pod \"route-controller-manager-6576b87f9c-5khh2\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.700275 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l997k\" (UniqueName: \"kubernetes.io/projected/39bae241-73b1-4078-9861-309c762b38b5-kube-api-access-l997k\") pod \"controller-manager-879f6c89f-x9c8r\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.700993 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" Sep 30 05:31:11 crc kubenswrapper[4956]: W0930 05:31:11.713622 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3c797d8_62e0_4ed4_b376_ac0aba58185e.slice/crio-77bbf2a682e6e080efb047ee7a49e81ab2127c4d449062133c76c15b529d3970 WatchSource:0}: Error finding container 77bbf2a682e6e080efb047ee7a49e81ab2127c4d449062133c76c15b529d3970: Status 404 returned error can't find the container with id 77bbf2a682e6e080efb047ee7a49e81ab2127c4d449062133c76c15b529d3970 Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.722727 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8sd5\" (UniqueName: \"kubernetes.io/projected/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-kube-api-access-w8sd5\") pod \"console-f9d7485db-rtd48\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.724750 4956 request.go:700] Waited for 1.909495995s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-oauth-apiserver/serviceaccounts/oauth-apiserver-sa/token Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.726457 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.738738 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhj9x\" (UniqueName: \"kubernetes.io/projected/d43f9c60-0360-4486-9c31-ecebf460d114-kube-api-access-vhj9x\") pod \"apiserver-7bbb656c7d-chdm2\" (UID: \"d43f9c60-0360-4486-9c31-ecebf460d114\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.756837 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.757975 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.774204 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.778020 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szmpz\" (UniqueName: \"kubernetes.io/projected/9ee01037-07c8-471e-860f-801980e351f8-kube-api-access-szmpz\") pod \"etcd-operator-b45778765-zhkrt\" (UID: \"9ee01037-07c8-471e-860f-801980e351f8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.782951 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x8bg\" (UniqueName: \"kubernetes.io/projected/9ec8ec3c-f0d3-41b1-a311-2eca015cd63a-kube-api-access-6x8bg\") pod \"machine-api-operator-5694c8668f-wjwhf\" (UID: \"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.802557 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njg99\" (UniqueName: \"kubernetes.io/projected/f29793e9-ea42-4820-8c66-a8f861e0370a-kube-api-access-njg99\") pod \"openshift-controller-manager-operator-756b6f6bc6-gfgqv\" (UID: \"f29793e9-ea42-4820-8c66-a8f861e0370a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.804353 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.805578 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.810836 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.822794 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.831507 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.833548 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.841342 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mwj5h" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.846653 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.852758 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.894007 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l87pb\" (UID: \"00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.914544 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghzpt\" (UniqueName: \"kubernetes.io/projected/c0770b87-e3c3-49d5-a290-3a3cd64367a7-kube-api-access-ghzpt\") pod \"ingress-operator-5b745b69d9-5j47n\" (UID: \"c0770b87-e3c3-49d5-a290-3a3cd64367a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.926408 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqwr5\" (UniqueName: \"kubernetes.io/projected/36e17eaf-9884-4aa0-bd1e-0768c2e592f9-kube-api-access-tqwr5\") pod \"openshift-apiserver-operator-796bbdcf4f-zhmb6\" (UID: \"36e17eaf-9884-4aa0-bd1e-0768c2e592f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.944518 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8r2d\" (UniqueName: \"kubernetes.io/projected/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-kube-api-access-z8r2d\") pod \"marketplace-operator-79b997595-rvfx8\" (UID: \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\") " pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.960777 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hk4n\" 
(UniqueName: \"kubernetes.io/projected/aa301454-b9a6-4bed-acd1-f2cf109b5259-kube-api-access-2hk4n\") pod \"control-plane-machine-set-operator-78cbb6b69f-pclsc\" (UID: \"aa301454-b9a6-4bed-acd1-f2cf109b5259\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.985268 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0770b87-e3c3-49d5-a290-3a3cd64367a7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5j47n\" (UID: \"c0770b87-e3c3-49d5-a290-3a3cd64367a7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.992769 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" event={"ID":"d3c797d8-62e0-4ed4-b376-ac0aba58185e","Type":"ContainerStarted","Data":"77bbf2a682e6e080efb047ee7a49e81ab2127c4d449062133c76c15b529d3970"} Sep 30 05:31:11 crc kubenswrapper[4956]: I0930 05:31:11.999675 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psjnd\" (UniqueName: \"kubernetes.io/projected/bcb2fabd-b2e0-41f3-8677-e55602876ca9-kube-api-access-psjnd\") pod \"multus-admission-controller-857f4d67dd-t5shm\" (UID: \"bcb2fabd-b2e0-41f3-8677-e55602876ca9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-t5shm" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.031180 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj5zd\" (UniqueName: \"kubernetes.io/projected/983d58af-8688-438e-8f8c-c658d1a9bdac-kube-api-access-fj5zd\") pod \"olm-operator-6b444d44fb-6mbsg\" (UID: \"983d58af-8688-438e-8f8c-c658d1a9bdac\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.040270 4956 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.042946 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlb7v\" (UniqueName: \"kubernetes.io/projected/3a4863df-527f-4038-be9f-286437603a44-kube-api-access-vlb7v\") pod \"migrator-59844c95c7-5v6z6\" (UID: \"3a4863df-527f-4038-be9f-286437603a44\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5v6z6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.058752 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.067871 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.068870 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wdsn\" (UniqueName: \"kubernetes.io/projected/2774c9ff-8b7d-4937-a6bb-86746bd4829f-kube-api-access-9wdsn\") pod \"service-ca-9c57cc56f-44xzl\" (UID: \"2774c9ff-8b7d-4937-a6bb-86746bd4829f\") " pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.085315 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxzkk\" (UniqueName: \"kubernetes.io/projected/106d5b0b-b745-4129-b766-ea2ab60c3569-kube-api-access-qxzkk\") pod \"service-ca-operator-777779d784-6hhrb\" (UID: \"106d5b0b-b745-4129-b766-ea2ab60c3569\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.100730 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtlz5\" (UniqueName: 
\"kubernetes.io/projected/140b2851-bf05-4ec3-87db-657aacefdbd4-kube-api-access-jtlz5\") pod \"collect-profiles-29320170-h7hht\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.123524 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85jdx\" (UniqueName: \"kubernetes.io/projected/56a1ba28-4b12-49ea-a04d-a9ab766a187f-kube-api-access-85jdx\") pod \"machine-config-operator-74547568cd-8rd8z\" (UID: \"56a1ba28-4b12-49ea-a04d-a9ab766a187f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.145768 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjt74\" (UniqueName: \"kubernetes.io/projected/1e8969ec-fae5-48f6-afe3-f230deb5f802-kube-api-access-mjt74\") pod \"package-server-manager-789f6589d5-j9dnt\" (UID: \"1e8969ec-fae5-48f6-afe3-f230deb5f802\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.172212 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee7ae876-4694-4cca-92e9-80528dc98bb4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-42782\" (UID: \"ee7ae876-4694-4cca-92e9-80528dc98bb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.182752 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.188836 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.190254 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s62b4"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.190460 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mrbck"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.201637 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.259650 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55234365-65bf-4faa-9f74-dd50194f062e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sgr97\" (UID: \"55234365-65bf-4faa-9f74-dd50194f062e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.259694 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dfbe88f3-0f36-4ac5-9d56-f80d07af868a-profile-collector-cert\") pod \"catalog-operator-68c6474976-j7mgz\" (UID: \"dfbe88f3-0f36-4ac5-9d56-f80d07af868a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.259737 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: 
\"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.259766 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.259823 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98c27d5f-42a5-4c1b-b5f4-49dcef583537-trusted-ca\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.259842 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmskj\" (UniqueName: \"kubernetes.io/projected/fa42e960-679c-45f3-ae65-5ab2ebbc81e5-kube-api-access-fmskj\") pod \"packageserver-d55dfcdfc-tzwdp\" (UID: \"fa42e960-679c-45f3-ae65-5ab2ebbc81e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.259867 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.259904 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-registry-tls\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.259926 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa42e960-679c-45f3-ae65-5ab2ebbc81e5-webhook-cert\") pod \"packageserver-d55dfcdfc-tzwdp\" (UID: \"fa42e960-679c-45f3-ae65-5ab2ebbc81e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.259945 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.259962 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25b00db7-971e-4eb8-8303-e6020a9ebc3c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7gbs8\" (UID: \"25b00db7-971e-4eb8-8303-e6020a9ebc3c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.259980 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9rbr\" (UniqueName: 
\"kubernetes.io/projected/25b00db7-971e-4eb8-8303-e6020a9ebc3c-kube-api-access-n9rbr\") pod \"cluster-image-registry-operator-dc59b4c8b-7gbs8\" (UID: \"25b00db7-971e-4eb8-8303-e6020a9ebc3c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260007 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260025 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z77w\" (UniqueName: \"kubernetes.io/projected/e1b8e355-5b80-4df5-9742-ccf35b26d474-kube-api-access-9z77w\") pod \"cluster-samples-operator-665b6dd947-s8rtf\" (UID: \"e1b8e355-5b80-4df5-9742-ccf35b26d474\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260055 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/98c27d5f-42a5-4c1b-b5f4-49dcef583537-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260074 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjkld\" (UniqueName: \"kubernetes.io/projected/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-kube-api-access-jjkld\") pod \"router-default-5444994796-xg6dj\" (UID: 
\"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260091 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55234365-65bf-4faa-9f74-dd50194f062e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sgr97\" (UID: \"55234365-65bf-4faa-9f74-dd50194f062e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260106 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-default-certificate\") pod \"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260164 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-audit-policies\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260194 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260212 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25b00db7-971e-4eb8-8303-e6020a9ebc3c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7gbs8\" (UID: \"25b00db7-971e-4eb8-8303-e6020a9ebc3c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260248 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25b00db7-971e-4eb8-8303-e6020a9ebc3c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7gbs8\" (UID: \"25b00db7-971e-4eb8-8303-e6020a9ebc3c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260282 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/98c27d5f-42a5-4c1b-b5f4-49dcef583537-registry-certificates\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260303 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55234365-65bf-4faa-9f74-dd50194f062e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sgr97\" (UID: \"55234365-65bf-4faa-9f74-dd50194f062e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260322 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/4608fdfb-f63a-4992-889e-3bbf043257b6-audit-dir\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260339 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-metrics-certs\") pod \"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260408 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260466 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fa42e960-679c-45f3-ae65-5ab2ebbc81e5-tmpfs\") pod \"packageserver-d55dfcdfc-tzwdp\" (UID: \"fa42e960-679c-45f3-ae65-5ab2ebbc81e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260498 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 
05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260515 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n568g\" (UniqueName: \"kubernetes.io/projected/dfbe88f3-0f36-4ac5-9d56-f80d07af868a-kube-api-access-n568g\") pod \"catalog-operator-68c6474976-j7mgz\" (UID: \"dfbe88f3-0f36-4ac5-9d56-f80d07af868a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260534 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260596 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-bound-sa-token\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260622 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260648 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkwbs\" (UniqueName: 
\"kubernetes.io/projected/4608fdfb-f63a-4992-889e-3bbf043257b6-kube-api-access-gkwbs\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260663 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dfbe88f3-0f36-4ac5-9d56-f80d07af868a-srv-cert\") pod \"catalog-operator-68c6474976-j7mgz\" (UID: \"dfbe88f3-0f36-4ac5-9d56-f80d07af868a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260680 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1b8e355-5b80-4df5-9742-ccf35b26d474-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-s8rtf\" (UID: \"e1b8e355-5b80-4df5-9742-ccf35b26d474\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260711 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-service-ca-bundle\") pod \"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260738 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm6bz\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-kube-api-access-dm6bz\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260760 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa42e960-679c-45f3-ae65-5ab2ebbc81e5-apiservice-cert\") pod \"packageserver-d55dfcdfc-tzwdp\" (UID: \"fa42e960-679c-45f3-ae65-5ab2ebbc81e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260792 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260818 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/98c27d5f-42a5-4c1b-b5f4-49dcef583537-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260897 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.260924 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-stats-auth\") pod \"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: E0930 05:31:12.262714 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:12.762702564 +0000 UTC m=+143.089823089 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.265240 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.272236 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-t5shm" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.289894 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rtd48"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.293872 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.294312 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.301759 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2kcxx"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.303082 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9c8r"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.314070 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.323353 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5v6z6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.346650 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.350747 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.361828 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362023 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55234365-65bf-4faa-9f74-dd50194f062e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sgr97\" (UID: \"55234365-65bf-4faa-9f74-dd50194f062e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362317 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60e6248e-c328-4964-a551-34f6ab016589-metrics-tls\") pod \"dns-default-w22x5\" (UID: \"60e6248e-c328-4964-a551-34f6ab016589\") " pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362357 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4608fdfb-f63a-4992-889e-3bbf043257b6-audit-dir\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362392 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-metrics-certs\") pod 
\"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362408 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362428 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rmc7z\" (UID: \"7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362459 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fa42e960-679c-45f3-ae65-5ab2ebbc81e5-tmpfs\") pod \"packageserver-d55dfcdfc-tzwdp\" (UID: \"fa42e960-679c-45f3-ae65-5ab2ebbc81e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362498 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362527 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n568g\" (UniqueName: \"kubernetes.io/projected/dfbe88f3-0f36-4ac5-9d56-f80d07af868a-kube-api-access-n568g\") pod \"catalog-operator-68c6474976-j7mgz\" (UID: \"dfbe88f3-0f36-4ac5-9d56-f80d07af868a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362555 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362570 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03-node-bootstrap-token\") pod \"machine-config-server-8658c\" (UID: \"1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03\") " pod="openshift-machine-config-operator/machine-config-server-8658c" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362586 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03-certs\") pod \"machine-config-server-8658c\" (UID: \"1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03\") " pod="openshift-machine-config-operator/machine-config-server-8658c" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362630 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-bound-sa-token\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362669 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362687 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkwbs\" (UniqueName: \"kubernetes.io/projected/4608fdfb-f63a-4992-889e-3bbf043257b6-kube-api-access-gkwbs\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362702 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dfbe88f3-0f36-4ac5-9d56-f80d07af868a-srv-cert\") pod \"catalog-operator-68c6474976-j7mgz\" (UID: \"dfbe88f3-0f36-4ac5-9d56-f80d07af868a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362730 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1b8e355-5b80-4df5-9742-ccf35b26d474-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-s8rtf\" (UID: \"e1b8e355-5b80-4df5-9742-ccf35b26d474\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362795 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-service-ca-bundle\") pod \"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362813 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa42e960-679c-45f3-ae65-5ab2ebbc81e5-apiservice-cert\") pod \"packageserver-d55dfcdfc-tzwdp\" (UID: \"fa42e960-679c-45f3-ae65-5ab2ebbc81e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362838 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm6bz\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-kube-api-access-dm6bz\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362867 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/98c27d5f-42a5-4c1b-b5f4-49dcef583537-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362884 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-plugins-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362910 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362926 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-csi-data-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362944 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-stats-auth\") pod \"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.362993 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zmm8\" (UniqueName: \"kubernetes.io/projected/5027231b-70a0-4243-aa58-2e2f14dbba62-kube-api-access-4zmm8\") pod \"ingress-canary-l45c8\" (UID: \"5027231b-70a0-4243-aa58-2e2f14dbba62\") " pod="openshift-ingress-canary/ingress-canary-l45c8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363016 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb92t\" (UniqueName: \"kubernetes.io/projected/91203589-4d13-4929-8bd1-28f8a40e2b44-kube-api-access-nb92t\") pod \"downloads-7954f5f757-v25lf\" (UID: \"91203589-4d13-4929-8bd1-28f8a40e2b44\") " 
pod="openshift-console/downloads-7954f5f757-v25lf" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363051 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55234365-65bf-4faa-9f74-dd50194f062e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sgr97\" (UID: \"55234365-65bf-4faa-9f74-dd50194f062e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363070 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh9g4\" (UniqueName: \"kubernetes.io/projected/fab6b2a0-7e69-4446-88fa-094cf0ebcf12-kube-api-access-gh9g4\") pod \"kube-storage-version-migrator-operator-b67b599dd-9hbc6\" (UID: \"fab6b2a0-7e69-4446-88fa-094cf0ebcf12\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363130 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dfbe88f3-0f36-4ac5-9d56-f80d07af868a-profile-collector-cert\") pod \"catalog-operator-68c6474976-j7mgz\" (UID: \"dfbe88f3-0f36-4ac5-9d56-f80d07af868a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363148 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60e6248e-c328-4964-a551-34f6ab016589-config-volume\") pod \"dns-default-w22x5\" (UID: \"60e6248e-c328-4964-a551-34f6ab016589\") " pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363237 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/fab6b2a0-7e69-4446-88fa-094cf0ebcf12-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9hbc6\" (UID: \"fab6b2a0-7e69-4446-88fa-094cf0ebcf12\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363280 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6-proxy-tls\") pod \"machine-config-controller-84d6567774-rmc7z\" (UID: \"7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363296 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27pxk\" (UniqueName: \"kubernetes.io/projected/7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6-kube-api-access-27pxk\") pod \"machine-config-controller-84d6567774-rmc7z\" (UID: \"7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363313 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363329 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-error\") pod 
\"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363384 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-socket-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363402 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmskj\" (UniqueName: \"kubernetes.io/projected/fa42e960-679c-45f3-ae65-5ab2ebbc81e5-kube-api-access-fmskj\") pod \"packageserver-d55dfcdfc-tzwdp\" (UID: \"fa42e960-679c-45f3-ae65-5ab2ebbc81e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363453 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fab6b2a0-7e69-4446-88fa-094cf0ebcf12-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9hbc6\" (UID: \"fab6b2a0-7e69-4446-88fa-094cf0ebcf12\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363514 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98c27d5f-42a5-4c1b-b5f4-49dcef583537-trusted-ca\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363545 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5027231b-70a0-4243-aa58-2e2f14dbba62-cert\") pod \"ingress-canary-l45c8\" (UID: \"5027231b-70a0-4243-aa58-2e2f14dbba62\") " pod="openshift-ingress-canary/ingress-canary-l45c8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363609 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363660 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml4gt\" (UniqueName: \"kubernetes.io/projected/1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03-kube-api-access-ml4gt\") pod \"machine-config-server-8658c\" (UID: \"1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03\") " pod="openshift-machine-config-operator/machine-config-server-8658c" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363695 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa42e960-679c-45f3-ae65-5ab2ebbc81e5-webhook-cert\") pod \"packageserver-d55dfcdfc-tzwdp\" (UID: \"fa42e960-679c-45f3-ae65-5ab2ebbc81e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363714 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltzws\" (UniqueName: \"kubernetes.io/projected/8fb391e0-dd9a-447f-af99-8ec70c6221bd-kube-api-access-ltzws\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " 
pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363746 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-registry-tls\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363767 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9rbr\" (UniqueName: \"kubernetes.io/projected/25b00db7-971e-4eb8-8303-e6020a9ebc3c-kube-api-access-n9rbr\") pod \"cluster-image-registry-operator-dc59b4c8b-7gbs8\" (UID: \"25b00db7-971e-4eb8-8303-e6020a9ebc3c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363800 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363818 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25b00db7-971e-4eb8-8303-e6020a9ebc3c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7gbs8\" (UID: \"25b00db7-971e-4eb8-8303-e6020a9ebc3c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363836 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-mountpoint-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363854 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363902 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z77w\" (UniqueName: \"kubernetes.io/projected/e1b8e355-5b80-4df5-9742-ccf35b26d474-kube-api-access-9z77w\") pod \"cluster-samples-operator-665b6dd947-s8rtf\" (UID: \"e1b8e355-5b80-4df5-9742-ccf35b26d474\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363932 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/98c27d5f-42a5-4c1b-b5f4-49dcef583537-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363952 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjkld\" (UniqueName: \"kubernetes.io/projected/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-kube-api-access-jjkld\") pod \"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc 
kubenswrapper[4956]: I0930 05:31:12.363970 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4r2j\" (UniqueName: \"kubernetes.io/projected/60e6248e-c328-4964-a551-34f6ab016589-kube-api-access-l4r2j\") pod \"dns-default-w22x5\" (UID: \"60e6248e-c328-4964-a551-34f6ab016589\") " pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.363999 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-audit-policies\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.364016 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55234365-65bf-4faa-9f74-dd50194f062e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sgr97\" (UID: \"55234365-65bf-4faa-9f74-dd50194f062e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.364035 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-default-certificate\") pod \"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.364053 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: 
\"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.364068 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25b00db7-971e-4eb8-8303-e6020a9ebc3c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7gbs8\" (UID: \"25b00db7-971e-4eb8-8303-e6020a9ebc3c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.364093 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25b00db7-971e-4eb8-8303-e6020a9ebc3c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7gbs8\" (UID: \"25b00db7-971e-4eb8-8303-e6020a9ebc3c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.364141 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-registration-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.364192 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/98c27d5f-42a5-4c1b-b5f4-49dcef583537-registry-certificates\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.364521 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/4608fdfb-f63a-4992-889e-3bbf043257b6-audit-dir\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: E0930 05:31:12.365321 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:12.865306917 +0000 UTC m=+143.192427432 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.367592 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55234365-65bf-4faa-9f74-dd50194f062e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sgr97\" (UID: \"55234365-65bf-4faa-9f74-dd50194f062e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.371080 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: 
I0930 05:31:12.371205 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-registry-tls\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.372488 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fa42e960-679c-45f3-ae65-5ab2ebbc81e5-tmpfs\") pod \"packageserver-d55dfcdfc-tzwdp\" (UID: \"fa42e960-679c-45f3-ae65-5ab2ebbc81e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.372634 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55234365-65bf-4faa-9f74-dd50194f062e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sgr97\" (UID: \"55234365-65bf-4faa-9f74-dd50194f062e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.372790 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/98c27d5f-42a5-4c1b-b5f4-49dcef583537-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.373195 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.376915 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.378420 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa42e960-679c-45f3-ae65-5ab2ebbc81e5-webhook-cert\") pod \"packageserver-d55dfcdfc-tzwdp\" (UID: \"fa42e960-679c-45f3-ae65-5ab2ebbc81e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.378693 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-audit-policies\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.378903 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-service-ca-bundle\") pod \"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.378969 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-metrics-certs\") pod \"router-default-5444994796-xg6dj\" 
(UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.379527 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dfbe88f3-0f36-4ac5-9d56-f80d07af868a-srv-cert\") pod \"catalog-operator-68c6474976-j7mgz\" (UID: \"dfbe88f3-0f36-4ac5-9d56-f80d07af868a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.379753 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.380434 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.380638 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/98c27d5f-42a5-4c1b-b5f4-49dcef583537-registry-certificates\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.380952 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-stats-auth\") pod \"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.381149 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.381322 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25b00db7-971e-4eb8-8303-e6020a9ebc3c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7gbs8\" (UID: \"25b00db7-971e-4eb8-8303-e6020a9ebc3c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.381325 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/98c27d5f-42a5-4c1b-b5f4-49dcef583537-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.381492 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.381517 
4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.382675 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25b00db7-971e-4eb8-8303-e6020a9ebc3c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7gbs8\" (UID: \"25b00db7-971e-4eb8-8303-e6020a9ebc3c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.383197 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa42e960-679c-45f3-ae65-5ab2ebbc81e5-apiservice-cert\") pod \"packageserver-d55dfcdfc-tzwdp\" (UID: \"fa42e960-679c-45f3-ae65-5ab2ebbc81e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.383254 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98c27d5f-42a5-4c1b-b5f4-49dcef583537-trusted-ca\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.384051 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.385838 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1b8e355-5b80-4df5-9742-ccf35b26d474-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-s8rtf\" (UID: \"e1b8e355-5b80-4df5-9742-ccf35b26d474\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.386614 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dfbe88f3-0f36-4ac5-9d56-f80d07af868a-profile-collector-cert\") pod \"catalog-operator-68c6474976-j7mgz\" (UID: \"dfbe88f3-0f36-4ac5-9d56-f80d07af868a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.391988 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-default-certificate\") pod \"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.394848 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.395925 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.397470 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.401600 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wjwhf"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.402253 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mwj5h"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.406784 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n568g\" (UniqueName: \"kubernetes.io/projected/dfbe88f3-0f36-4ac5-9d56-f80d07af868a-kube-api-access-n568g\") pod \"catalog-operator-68c6474976-j7mgz\" (UID: \"dfbe88f3-0f36-4ac5-9d56-f80d07af868a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.427017 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.428206 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ztg8w"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.440546 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-bound-sa-token\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.442682 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmskj\" (UniqueName: \"kubernetes.io/projected/fa42e960-679c-45f3-ae65-5ab2ebbc81e5-kube-api-access-fmskj\") pod \"packageserver-d55dfcdfc-tzwdp\" (UID: \"fa42e960-679c-45f3-ae65-5ab2ebbc81e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468291 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60e6248e-c328-4964-a551-34f6ab016589-metrics-tls\") pod \"dns-default-w22x5\" (UID: \"60e6248e-c328-4964-a551-34f6ab016589\") " pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468344 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rmc7z\" (UID: \"7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468380 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03-node-bootstrap-token\") pod \"machine-config-server-8658c\" (UID: \"1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03\") " pod="openshift-machine-config-operator/machine-config-server-8658c" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468410 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03-certs\") pod \"machine-config-server-8658c\" 
(UID: \"1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03\") " pod="openshift-machine-config-operator/machine-config-server-8658c" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468469 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468492 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-plugins-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468511 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-csi-data-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468536 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zmm8\" (UniqueName: \"kubernetes.io/projected/5027231b-70a0-4243-aa58-2e2f14dbba62-kube-api-access-4zmm8\") pod \"ingress-canary-l45c8\" (UID: \"5027231b-70a0-4243-aa58-2e2f14dbba62\") " pod="openshift-ingress-canary/ingress-canary-l45c8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468556 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb92t\" (UniqueName: 
\"kubernetes.io/projected/91203589-4d13-4929-8bd1-28f8a40e2b44-kube-api-access-nb92t\") pod \"downloads-7954f5f757-v25lf\" (UID: \"91203589-4d13-4929-8bd1-28f8a40e2b44\") " pod="openshift-console/downloads-7954f5f757-v25lf" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468580 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh9g4\" (UniqueName: \"kubernetes.io/projected/fab6b2a0-7e69-4446-88fa-094cf0ebcf12-kube-api-access-gh9g4\") pod \"kube-storage-version-migrator-operator-b67b599dd-9hbc6\" (UID: \"fab6b2a0-7e69-4446-88fa-094cf0ebcf12\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468600 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60e6248e-c328-4964-a551-34f6ab016589-config-volume\") pod \"dns-default-w22x5\" (UID: \"60e6248e-c328-4964-a551-34f6ab016589\") " pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468622 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fab6b2a0-7e69-4446-88fa-094cf0ebcf12-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9hbc6\" (UID: \"fab6b2a0-7e69-4446-88fa-094cf0ebcf12\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468644 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6-proxy-tls\") pod \"machine-config-controller-84d6567774-rmc7z\" (UID: \"7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" Sep 30 05:31:12 crc 
kubenswrapper[4956]: I0930 05:31:12.468662 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27pxk\" (UniqueName: \"kubernetes.io/projected/7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6-kube-api-access-27pxk\") pod \"machine-config-controller-84d6567774-rmc7z\" (UID: \"7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468684 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-socket-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468704 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fab6b2a0-7e69-4446-88fa-094cf0ebcf12-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9hbc6\" (UID: \"fab6b2a0-7e69-4446-88fa-094cf0ebcf12\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468722 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5027231b-70a0-4243-aa58-2e2f14dbba62-cert\") pod \"ingress-canary-l45c8\" (UID: \"5027231b-70a0-4243-aa58-2e2f14dbba62\") " pod="openshift-ingress-canary/ingress-canary-l45c8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468743 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml4gt\" (UniqueName: \"kubernetes.io/projected/1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03-kube-api-access-ml4gt\") pod \"machine-config-server-8658c\" (UID: 
\"1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03\") " pod="openshift-machine-config-operator/machine-config-server-8658c" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468768 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltzws\" (UniqueName: \"kubernetes.io/projected/8fb391e0-dd9a-447f-af99-8ec70c6221bd-kube-api-access-ltzws\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468805 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-mountpoint-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468846 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4r2j\" (UniqueName: \"kubernetes.io/projected/60e6248e-c328-4964-a551-34f6ab016589-kube-api-access-l4r2j\") pod \"dns-default-w22x5\" (UID: \"60e6248e-c328-4964-a551-34f6ab016589\") " pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.468875 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-registration-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.469161 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-registration-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: 
\"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.469901 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fab6b2a0-7e69-4446-88fa-094cf0ebcf12-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9hbc6\" (UID: \"fab6b2a0-7e69-4446-88fa-094cf0ebcf12\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.470443 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm6bz\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-kube-api-access-dm6bz\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.470889 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60e6248e-c328-4964-a551-34f6ab016589-config-volume\") pod \"dns-default-w22x5\" (UID: \"60e6248e-c328-4964-a551-34f6ab016589\") " pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.470986 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-plugins-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.473497 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-mountpoint-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: 
\"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.474695 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-csi-data-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.474730 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8fb391e0-dd9a-447f-af99-8ec70c6221bd-socket-dir\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: E0930 05:31:12.474918 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:12.974907629 +0000 UTC m=+143.302028154 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.477781 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rmc7z\" (UID: \"7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.478625 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5027231b-70a0-4243-aa58-2e2f14dbba62-cert\") pod \"ingress-canary-l45c8\" (UID: \"5027231b-70a0-4243-aa58-2e2f14dbba62\") " pod="openshift-ingress-canary/ingress-canary-l45c8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.482624 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6-proxy-tls\") pod \"machine-config-controller-84d6567774-rmc7z\" (UID: \"7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.485793 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9rbr\" (UniqueName: \"kubernetes.io/projected/25b00db7-971e-4eb8-8303-e6020a9ebc3c-kube-api-access-n9rbr\") pod \"cluster-image-registry-operator-dc59b4c8b-7gbs8\" 
(UID: \"25b00db7-971e-4eb8-8303-e6020a9ebc3c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.487250 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03-certs\") pod \"machine-config-server-8658c\" (UID: \"1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03\") " pod="openshift-machine-config-operator/machine-config-server-8658c" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.487830 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03-node-bootstrap-token\") pod \"machine-config-server-8658c\" (UID: \"1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03\") " pod="openshift-machine-config-operator/machine-config-server-8658c" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.489738 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fab6b2a0-7e69-4446-88fa-094cf0ebcf12-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9hbc6\" (UID: \"fab6b2a0-7e69-4446-88fa-094cf0ebcf12\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.494809 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60e6248e-c328-4964-a551-34f6ab016589-metrics-tls\") pod \"dns-default-w22x5\" (UID: \"60e6248e-c328-4964-a551-34f6ab016589\") " pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.495478 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zhkrt"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.498252 4956 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.498295 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.507662 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkwbs\" (UniqueName: \"kubernetes.io/projected/4608fdfb-f63a-4992-889e-3bbf043257b6-kube-api-access-gkwbs\") pod \"oauth-openshift-558db77b4-4fn5g\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.531974 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25b00db7-971e-4eb8-8303-e6020a9ebc3c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7gbs8\" (UID: \"25b00db7-971e-4eb8-8303-e6020a9ebc3c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.563840 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z77w\" (UniqueName: \"kubernetes.io/projected/e1b8e355-5b80-4df5-9742-ccf35b26d474-kube-api-access-9z77w\") pod \"cluster-samples-operator-665b6dd947-s8rtf\" (UID: \"e1b8e355-5b80-4df5-9742-ccf35b26d474\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.565503 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjkld\" (UniqueName: \"kubernetes.io/projected/7aa8b675-10f2-4d8e-8e3a-79359f16d7bc-kube-api-access-jjkld\") pod \"router-default-5444994796-xg6dj\" (UID: \"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc\") " 
pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.572077 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:12 crc kubenswrapper[4956]: E0930 05:31:12.573210 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:13.073186715 +0000 UTC m=+143.400307240 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.576544 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:12 crc kubenswrapper[4956]: W0930 05:31:12.582327 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa301454_b9a6_4bed_acd1_f2cf109b5259.slice/crio-7551d8e4c112ef831ade4fccfe9ffe2707aff1a3e1096c2e091b188c0e9cc120 WatchSource:0}: Error finding container 7551d8e4c112ef831ade4fccfe9ffe2707aff1a3e1096c2e091b188c0e9cc120: Status 404 returned error can't find the container with id 7551d8e4c112ef831ade4fccfe9ffe2707aff1a3e1096c2e091b188c0e9cc120 Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.598738 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rvfx8"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.600127 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.603685 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55234365-65bf-4faa-9f74-dd50194f062e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sgr97\" (UID: \"55234365-65bf-4faa-9f74-dd50194f062e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.625061 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.627048 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.628725 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-ltzws\" (UniqueName: \"kubernetes.io/projected/8fb391e0-dd9a-447f-af99-8ec70c6221bd-kube-api-access-ltzws\") pod \"csi-hostpathplugin-69hd4\" (UID: \"8fb391e0-dd9a-447f-af99-8ec70c6221bd\") " pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.645824 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml4gt\" (UniqueName: \"kubernetes.io/projected/1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03-kube-api-access-ml4gt\") pod \"machine-config-server-8658c\" (UID: \"1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03\") " pod="openshift-machine-config-operator/machine-config-server-8658c" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.662612 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4r2j\" (UniqueName: \"kubernetes.io/projected/60e6248e-c328-4964-a551-34f6ab016589-kube-api-access-l4r2j\") pod \"dns-default-w22x5\" (UID: \"60e6248e-c328-4964-a551-34f6ab016589\") " pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.678580 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.679083 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.679476 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb"] Sep 30 05:31:12 crc kubenswrapper[4956]: E0930 05:31:12.679572 4956 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:13.179559336 +0000 UTC m=+143.506679861 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.682830 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27pxk\" (UniqueName: \"kubernetes.io/projected/7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6-kube-api-access-27pxk\") pod \"machine-config-controller-84d6567774-rmc7z\" (UID: \"7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.688796 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.694066 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8658c" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.694991 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.700870 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb92t\" (UniqueName: \"kubernetes.io/projected/91203589-4d13-4929-8bd1-28f8a40e2b44-kube-api-access-nb92t\") pod \"downloads-7954f5f757-v25lf\" (UID: \"91203589-4d13-4929-8bd1-28f8a40e2b44\") " pod="openshift-console/downloads-7954f5f757-v25lf" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.725652 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-69hd4" Sep 30 05:31:12 crc kubenswrapper[4956]: W0930 05:31:12.731444 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00da0261_ae3c_40c6_a3a3_b7f8aaf3aff7.slice/crio-a64c1edf39ad42c1febea778c67089ac1d693d56108115e7f6255fc0d5cbace4 WatchSource:0}: Error finding container a64c1edf39ad42c1febea778c67089ac1d693d56108115e7f6255fc0d5cbace4: Status 404 returned error can't find the container with id a64c1edf39ad42c1febea778c67089ac1d693d56108115e7f6255fc0d5cbace4 Sep 30 05:31:12 crc kubenswrapper[4956]: W0930 05:31:12.731720 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36e17eaf_9884_4aa0_bd1e_0768c2e592f9.slice/crio-28bb179244e0c9bb5c82b0644388d96e474e864574b4a2277fd0d99e163318ae WatchSource:0}: Error finding container 28bb179244e0c9bb5c82b0644388d96e474e864574b4a2277fd0d99e163318ae: Status 404 returned error can't find the container with id 28bb179244e0c9bb5c82b0644388d96e474e864574b4a2277fd0d99e163318ae Sep 30 05:31:12 crc 
kubenswrapper[4956]: W0930 05:31:12.731848 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0770b87_e3c3_49d5_a290_3a3cd64367a7.slice/crio-8968b9f6182e87cb5d7f5e68871f9a85201ac2f0d44049b422dbc217471870dd WatchSource:0}: Error finding container 8968b9f6182e87cb5d7f5e68871f9a85201ac2f0d44049b422dbc217471870dd: Status 404 returned error can't find the container with id 8968b9f6182e87cb5d7f5e68871f9a85201ac2f0d44049b422dbc217471870dd Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.732851 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zmm8\" (UniqueName: \"kubernetes.io/projected/5027231b-70a0-4243-aa58-2e2f14dbba62-kube-api-access-4zmm8\") pod \"ingress-canary-l45c8\" (UID: \"5027231b-70a0-4243-aa58-2e2f14dbba62\") " pod="openshift-ingress-canary/ingress-canary-l45c8" Sep 30 05:31:12 crc kubenswrapper[4956]: W0930 05:31:12.733760 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod983d58af_8688_438e_8f8c_c658d1a9bdac.slice/crio-6fb8c8c40c44b9b576446e58d95e123bf098d270ed1256920dafce07700fa8d8 WatchSource:0}: Error finding container 6fb8c8c40c44b9b576446e58d95e123bf098d270ed1256920dafce07700fa8d8: Status 404 returned error can't find the container with id 6fb8c8c40c44b9b576446e58d95e123bf098d270ed1256920dafce07700fa8d8 Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.752506 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.757977 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh9g4\" (UniqueName: \"kubernetes.io/projected/fab6b2a0-7e69-4446-88fa-094cf0ebcf12-kube-api-access-gh9g4\") pod \"kube-storage-version-migrator-operator-b67b599dd-9hbc6\" (UID: \"fab6b2a0-7e69-4446-88fa-094cf0ebcf12\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.772005 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-v25lf" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.787595 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:12 crc kubenswrapper[4956]: E0930 05:31:12.787943 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:13.287927477 +0000 UTC m=+143.615048002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.797226 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.807445 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.847632 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.854416 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.860636 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.889271 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:12 crc kubenswrapper[4956]: E0930 05:31:12.889666 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:13.389627029 +0000 UTC m=+143.716747554 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.910084 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.911180 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.921935 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-t5shm"] Sep 30 05:31:12 crc 
kubenswrapper[4956]: I0930 05:31:12.974438 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.984425 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt"] Sep 30 05:31:12 crc kubenswrapper[4956]: I0930 05:31:12.990510 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:12 crc kubenswrapper[4956]: E0930 05:31:12.991102 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:13.491040201 +0000 UTC m=+143.818160716 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.008669 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.009796 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mwj5h" event={"ID":"15065912-8d28-47dd-adaf-f7aedc754526","Type":"ContainerStarted","Data":"faedfd46fe59af45b66901961724b5726abc0df26030051e969559ebaa75a463"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.011289 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-44xzl"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.023027 4956 generic.go:334] "Generic (PLEG): container finished" podID="90e0bca8-7e63-438d-95ca-7b855c885655" containerID="3e0c8d36a0783c976d49b47d7366f7b7927fcf9205bda70acc52ef40157015bb" exitCode=0 Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.023155 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mrbck" event={"ID":"90e0bca8-7e63-438d-95ca-7b855c885655","Type":"ContainerDied","Data":"3e0c8d36a0783c976d49b47d7366f7b7927fcf9205bda70acc52ef40157015bb"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.023184 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mrbck" 
event={"ID":"90e0bca8-7e63-438d-95ca-7b855c885655","Type":"ContainerStarted","Data":"bd9dd27d15f6c0595d273f667a45cef35ea0facb82f85c3c20b694e0b4792b14"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.024699 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.032760 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l45c8" Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.037181 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" event={"ID":"00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7","Type":"ContainerStarted","Data":"a64c1edf39ad42c1febea778c67089ac1d693d56108115e7f6255fc0d5cbace4"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.075430 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-t5shm" event={"ID":"bcb2fabd-b2e0-41f3-8677-e55602876ca9","Type":"ContainerStarted","Data":"66547e25df02dae245639837c505fb909e8455a6aa253e57c69f26f5f7912577"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.077345 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.083671 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5v6z6"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.092088 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: 
\"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:13 crc kubenswrapper[4956]: E0930 05:31:13.092452 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:13.592439373 +0000 UTC m=+143.919559898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.096176 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.106184 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" event={"ID":"c0770b87-e3c3-49d5-a290-3a3cd64367a7","Type":"ContainerStarted","Data":"8968b9f6182e87cb5d7f5e68871f9a85201ac2f0d44049b422dbc217471870dd"} Sep 30 05:31:13 crc kubenswrapper[4956]: W0930 05:31:13.106311 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod106d5b0b_b745_4129_b766_ea2ab60c3569.slice/crio-26dec63f1f47065229f0f4a9f0f145a1a152c3db693e85164e9360f9478bd100 WatchSource:0}: Error finding container 26dec63f1f47065229f0f4a9f0f145a1a152c3db693e85164e9360f9478bd100: Status 404 returned error can't find the container with id 26dec63f1f47065229f0f4a9f0f145a1a152c3db693e85164e9360f9478bd100 Sep 30 
05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.110448 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" event={"ID":"9ee01037-07c8-471e-860f-801980e351f8","Type":"ContainerStarted","Data":"c491e48a6903dc1afe3e3d342a1f4e4e9dd24682cd8aca5d80b7b7ff06feb426"} Sep 30 05:31:13 crc kubenswrapper[4956]: W0930 05:31:13.112440 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a4863df_527f_4038_be9f_286437603a44.slice/crio-ffb203405ac34477e50e071c397a3892a4ab0bcf83a29894b9da1c7237d63281 WatchSource:0}: Error finding container ffb203405ac34477e50e071c397a3892a4ab0bcf83a29894b9da1c7237d63281: Status 404 returned error can't find the container with id ffb203405ac34477e50e071c397a3892a4ab0bcf83a29894b9da1c7237d63281 Sep 30 05:31:13 crc kubenswrapper[4956]: W0930 05:31:13.125194 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2774c9ff_8b7d_4937_a6bb_86746bd4829f.slice/crio-420c0522134a709321df5284a41dfdabc76f7a66a3b03304e828ded3e4ff070e WatchSource:0}: Error finding container 420c0522134a709321df5284a41dfdabc76f7a66a3b03304e828ded3e4ff070e: Status 404 returned error can't find the container with id 420c0522134a709321df5284a41dfdabc76f7a66a3b03304e828ded3e4ff070e Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.131029 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" event={"ID":"d3c797d8-62e0-4ed4-b376-ac0aba58185e","Type":"ContainerStarted","Data":"c8ccb4b0dc83bdc04d4681509662f4de15ec2ade5e45b8718ac2db19b599a2aa"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.131073 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" 
event={"ID":"d3c797d8-62e0-4ed4-b376-ac0aba58185e","Type":"ContainerStarted","Data":"b6c73f5406bc149adeda65b5b42a49b239562c66a44cecd1fe43d571e393760f"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.163867 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" event={"ID":"36e17eaf-9884-4aa0-bd1e-0768c2e592f9","Type":"ContainerStarted","Data":"28bb179244e0c9bb5c82b0644388d96e474e864574b4a2277fd0d99e163318ae"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.169009 4956 generic.go:334] "Generic (PLEG): container finished" podID="2ad2e251-9549-4f18-8467-8dc98924bc23" containerID="778a6b2a2b9153d5a226fe045cad0d8ce371f728ac6ff6c67170c642485eb66c" exitCode=0 Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.169171 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" event={"ID":"2ad2e251-9549-4f18-8467-8dc98924bc23","Type":"ContainerDied","Data":"778a6b2a2b9153d5a226fe045cad0d8ce371f728ac6ff6c67170c642485eb66c"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.169225 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" event={"ID":"2ad2e251-9549-4f18-8467-8dc98924bc23","Type":"ContainerStarted","Data":"931a54a4846811d3143be5bb54294e04c66c1077e30759cd0da5dcd173cf02a6"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.183414 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc" event={"ID":"aa301454-b9a6-4bed-acd1-f2cf109b5259","Type":"ContainerStarted","Data":"7551d8e4c112ef831ade4fccfe9ffe2707aff1a3e1096c2e091b188c0e9cc120"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.195125 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:13 crc kubenswrapper[4956]: E0930 05:31:13.195332 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:13.695306625 +0000 UTC m=+144.022427150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.195379 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:13 crc kubenswrapper[4956]: E0930 05:31:13.195966 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:13.695953678 +0000 UTC m=+144.023074203 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.205180 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rtd48" event={"ID":"8f523f25-bc41-44dc-b311-bf6df1cbc2ee","Type":"ContainerStarted","Data":"14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.205224 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rtd48" event={"ID":"8f523f25-bc41-44dc-b311-bf6df1cbc2ee","Type":"ContainerStarted","Data":"b0596f4275d84fec44630e6175cfaad09c7d7dd80db8671b890dd42a0d3add18"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.220724 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8658c" event={"ID":"1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03","Type":"ContainerStarted","Data":"e0b73708379dc4f8714afe202ede2ff5730e2217fade5b2696506d8a1fba7c12"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.223125 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" event={"ID":"d43f9c60-0360-4486-9c31-ecebf460d114","Type":"ContainerStarted","Data":"0712f7b71e579a40c5cce991ee7179d31e2ba024f2f2388ef67b689fc1c905ca"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.225708 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" 
event={"ID":"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a","Type":"ContainerStarted","Data":"ab314c2ba56d28da3b6d803c35c791f44b7492a451997ef814ac4dcb93bec33b"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.225735 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" event={"ID":"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a","Type":"ContainerStarted","Data":"d25c6982342a75c68d384f6f99fcc0c69f79a6872f0326e27ccfae6d8c2132d5"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.228455 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" event={"ID":"9bdc12c7-725c-4496-8e0e-e4ec3d911cca","Type":"ContainerStarted","Data":"89d23efbfd40682105b77559e4fe9a30342e2bae5051216e9e305e1e8ebadba3"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.230381 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ztg8w" event={"ID":"ebad915a-95b0-4596-8cbb-3d4acc6eb32a","Type":"ContainerStarted","Data":"289d658c0144810a4e2139f87fcc05609f9ae9895f9c6911894ff75f3bce4ddd"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.230619 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.231692 4956 patch_prober.go:28] interesting pod/console-operator-58897d9998-ztg8w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.231727 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ztg8w" podUID="ebad915a-95b0-4596-8cbb-3d4acc6eb32a" containerName="console-operator" probeResult="failure" 
output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.236147 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" event={"ID":"f29793e9-ea42-4820-8c66-a8f861e0370a","Type":"ContainerStarted","Data":"27776b6f1162c2d3c7542893d0a1bf1d1d2d6bfb8c50dce96943f9c7598f4614"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.236180 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" event={"ID":"f29793e9-ea42-4820-8c66-a8f861e0370a","Type":"ContainerStarted","Data":"a9d3ef397fc7b89e265a3fc53aaf20b9e92b97eb46bc7349e44eb494cc5796da"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.248486 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" event={"ID":"ee7ae876-4694-4cca-92e9-80528dc98bb4","Type":"ContainerStarted","Data":"8615bc55199cab877ac44088641bd172d95b2dc549fa73a3e92332e1450764da"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.255807 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" event={"ID":"39bae241-73b1-4078-9861-309c762b38b5","Type":"ContainerStarted","Data":"cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.255873 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" event={"ID":"39bae241-73b1-4078-9861-309c762b38b5","Type":"ContainerStarted","Data":"49d9abbcf43428b3d6b7abc41c0dd46dd2ea1b48723520a0538356ed43dffbe0"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.255916 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.257963 4956 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-x9c8r container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.258013 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" podUID="39bae241-73b1-4078-9861-309c762b38b5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.261299 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" event={"ID":"983d58af-8688-438e-8f8c-c658d1a9bdac","Type":"ContainerStarted","Data":"6fb8c8c40c44b9b576446e58d95e123bf098d270ed1256920dafce07700fa8d8"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.296490 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:13 crc kubenswrapper[4956]: E0930 05:31:13.296851 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:13.796832272 +0000 UTC m=+144.123952797 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.297156 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:13 crc kubenswrapper[4956]: E0930 05:31:13.299432 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:13.79942267 +0000 UTC m=+144.126543195 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.317274 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" event={"ID":"050f9843-c228-4681-96e9-8649f7eff6fc","Type":"ContainerStarted","Data":"c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.317316 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" event={"ID":"050f9843-c228-4681-96e9-8649f7eff6fc","Type":"ContainerStarted","Data":"2db853daeb8aef784ed0a57451fa2635caecedb67f613c991dcbe66863168131"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.317896 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.322048 4956 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-5khh2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.322084 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" 
podUID="050f9843-c228-4681-96e9-8649f7eff6fc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.323994 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" event={"ID":"f91889a1-69c9-48ae-a920-00b05adab7e5","Type":"ContainerStarted","Data":"971002a9a5fc0a369e17527a17670761f224e152d829bb6f4ff2f0b3e390b94f"} Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.324020 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" event={"ID":"f91889a1-69c9-48ae-a920-00b05adab7e5","Type":"ContainerStarted","Data":"927deebd9ae30c19ce5a23ff9705134eeb18a222bfb4dd722b0ab08c1ec07d02"} Sep 30 05:31:13 crc kubenswrapper[4956]: W0930 05:31:13.367976 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa8b675_10f2_4d8e_8e3a_79359f16d7bc.slice/crio-f0e35fe5d40302a68970477542ff5cd90d79aa6c20e14cf5e3958d9f79eaf937 WatchSource:0}: Error finding container f0e35fe5d40302a68970477542ff5cd90d79aa6c20e14cf5e3958d9f79eaf937: Status 404 returned error can't find the container with id f0e35fe5d40302a68970477542ff5cd90d79aa6c20e14cf5e3958d9f79eaf937 Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.382703 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-69hd4"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.400099 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:13 crc kubenswrapper[4956]: E0930 05:31:13.403362 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:13.903342918 +0000 UTC m=+144.230463453 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.450021 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w22x5"] Sep 30 05:31:13 crc kubenswrapper[4956]: W0930 05:31:13.460776 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fb391e0_dd9a_447f_af99_8ec70c6221bd.slice/crio-df37cb21d4c6cd3cb3f6cfd2e696af887e5ddcdaa1be47dbc625189b3512d987 WatchSource:0}: Error finding container df37cb21d4c6cd3cb3f6cfd2e696af887e5ddcdaa1be47dbc625189b3512d987: Status 404 returned error can't find the container with id df37cb21d4c6cd3cb3f6cfd2e696af887e5ddcdaa1be47dbc625189b3512d987 Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.497189 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-v25lf"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.508986 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:13 crc kubenswrapper[4956]: E0930 05:31:13.510159 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:14.010147975 +0000 UTC m=+144.337268500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.611269 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.615518 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.616087 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:13 crc kubenswrapper[4956]: E0930 05:31:13.616397 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:14.116383442 +0000 UTC m=+144.443503967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.708294 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.708344 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4fn5g"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.717920 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:13 crc kubenswrapper[4956]: E0930 05:31:13.718203 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:14.218191148 +0000 UTC m=+144.545311673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.720330 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6"] Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.818438 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:13 crc kubenswrapper[4956]: E0930 05:31:13.818764 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:14.31873948 +0000 UTC m=+144.645860005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:13 crc kubenswrapper[4956]: I0930 05:31:13.920424 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:13 crc kubenswrapper[4956]: E0930 05:31:13.920925 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:14.420914059 +0000 UTC m=+144.748034584 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.021606 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:14 crc kubenswrapper[4956]: E0930 05:31:14.022397 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:14.522371363 +0000 UTC m=+144.849491888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.063739 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l45c8"] Sep 30 05:31:14 crc kubenswrapper[4956]: W0930 05:31:14.092658 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5027231b_70a0_4243_aa58_2e2f14dbba62.slice/crio-95eefb3eccfff69c32f7d5a8f5e9b6eac7eb39cfb63d93aa1e282ca125066f16 WatchSource:0}: Error finding container 95eefb3eccfff69c32f7d5a8f5e9b6eac7eb39cfb63d93aa1e282ca125066f16: Status 404 returned error can't find the container with id 95eefb3eccfff69c32f7d5a8f5e9b6eac7eb39cfb63d93aa1e282ca125066f16 Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.124658 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:14 crc kubenswrapper[4956]: E0930 05:31:14.126426 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:14.626409165 +0000 UTC m=+144.953529690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.226270 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:14 crc kubenswrapper[4956]: E0930 05:31:14.226822 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:14.726769571 +0000 UTC m=+145.053890106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.226911 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:14 crc kubenswrapper[4956]: E0930 05:31:14.227492 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:14.727484106 +0000 UTC m=+145.054604631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.330733 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:14 crc kubenswrapper[4956]: E0930 05:31:14.331367 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:14.831350182 +0000 UTC m=+145.158470707 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.390653 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" event={"ID":"106d5b0b-b745-4129-b766-ea2ab60c3569","Type":"ContainerStarted","Data":"804abe6ff1dfa2b8ec96205286012eb1ada210cffdd671a159f40e1ad0d78b5e"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.390715 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" event={"ID":"106d5b0b-b745-4129-b766-ea2ab60c3569","Type":"ContainerStarted","Data":"26dec63f1f47065229f0f4a9f0f145a1a152c3db693e85164e9360f9478bd100"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.422355 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mwj5h" event={"ID":"15065912-8d28-47dd-adaf-f7aedc754526","Type":"ContainerStarted","Data":"22252d15cb896448270fd0b6bfe59c1218b9cb8f50b8938b160c0e4b7d7473f6"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.427467 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" event={"ID":"140b2851-bf05-4ec3-87db-657aacefdbd4","Type":"ContainerStarted","Data":"a8158496e72dde154eee6af181c09d157c45adbc5671ed3ffee47dfc313fcd29"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.427511 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" event={"ID":"140b2851-bf05-4ec3-87db-657aacefdbd4","Type":"ContainerStarted","Data":"7d7487f28f9b8d155df2d31f952ae540a39467741c335ac6290b85127a6ae447"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.432445 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:14 crc kubenswrapper[4956]: E0930 05:31:14.432823 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:14.932811597 +0000 UTC m=+145.259932122 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.452184 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" podStartSLOduration=124.452167447 podStartE2EDuration="2m4.452167447s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.451919229 +0000 UTC m=+144.779039754" watchObservedRunningTime="2025-09-30 05:31:14.452167447 +0000 UTC m=+144.779287972" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.453618 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rtd48" podStartSLOduration=124.453612566 podStartE2EDuration="2m4.453612566s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.406127796 +0000 UTC m=+144.733248321" watchObservedRunningTime="2025-09-30 05:31:14.453612566 +0000 UTC m=+144.780733091" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.482441 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-2kcxx" podStartSLOduration=124.48242208 podStartE2EDuration="2m4.48242208s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.48240615 +0000 UTC m=+144.809526685" watchObservedRunningTime="2025-09-30 05:31:14.48242208 +0000 UTC m=+144.809542605" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.514076 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mrbck" event={"ID":"90e0bca8-7e63-438d-95ca-7b855c885655","Type":"ContainerStarted","Data":"9a7d226fbd34c05e00c4d1320f43708339db3fad3bbe96bd371883a259df4845"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.518791 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gfgqv" podStartSLOduration=124.518778011 podStartE2EDuration="2m4.518778011s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.518515102 +0000 UTC m=+144.845635627" watchObservedRunningTime="2025-09-30 05:31:14.518778011 +0000 UTC m=+144.845898526" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.533720 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:14 crc kubenswrapper[4956]: E0930 05:31:14.534895 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:15.034875311 +0000 UTC m=+145.361995836 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.546526 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" event={"ID":"9bdc12c7-725c-4496-8e0e-e4ec3d911cca","Type":"ContainerStarted","Data":"a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.547418 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.551266 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf" event={"ID":"e1b8e355-5b80-4df5-9742-ccf35b26d474","Type":"ContainerStarted","Data":"2da5c27f0b1f70ad8210f8439fd76b6a91d167c4696f022d8f9d2c9aa559091d"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.561067 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-69hd4" event={"ID":"8fb391e0-dd9a-447f-af99-8ec70c6221bd","Type":"ContainerStarted","Data":"df37cb21d4c6cd3cb3f6cfd2e696af887e5ddcdaa1be47dbc625189b3512d987"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.568369 4956 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rvfx8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" 
start-of-body= Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.568413 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" podUID="9bdc12c7-725c-4496-8e0e-e4ec3d911cca" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.590037 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" event={"ID":"4608fdfb-f63a-4992-889e-3bbf043257b6","Type":"ContainerStarted","Data":"fd772c2879b206f528f3a6f36f5300d7fefe3b729a3fe64e9c8933e3aa9ee160"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.606459 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-442db" podStartSLOduration=125.606443505 podStartE2EDuration="2m5.606443505s" podCreationTimestamp="2025-09-30 05:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.603846336 +0000 UTC m=+144.930966861" watchObservedRunningTime="2025-09-30 05:31:14.606443505 +0000 UTC m=+144.933564030" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.631191 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xg6dj" event={"ID":"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc","Type":"ContainerStarted","Data":"7004f57e9fa9ac2cba3469148ab7f6d08f06f53f4584fc505aa5cdf6a55cb91d"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.631235 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xg6dj" event={"ID":"7aa8b675-10f2-4d8e-8e3a-79359f16d7bc","Type":"ContainerStarted","Data":"f0e35fe5d40302a68970477542ff5cd90d79aa6c20e14cf5e3958d9f79eaf937"} 
Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.635041 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:14 crc kubenswrapper[4956]: E0930 05:31:14.636097 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:15.136082566 +0000 UTC m=+145.463203091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.652705 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" event={"ID":"1e8969ec-fae5-48f6-afe3-f230deb5f802","Type":"ContainerStarted","Data":"208af762f28680d2052b4567b8904e91ea31c9965f34ade8081118259e089519"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.652746 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" event={"ID":"1e8969ec-fae5-48f6-afe3-f230deb5f802","Type":"ContainerStarted","Data":"0225e9ef31a7b6214e9a459106ded7794f5d6e7b9d2e3c1f3e61802d93791b67"} Sep 30 05:31:14 crc 
kubenswrapper[4956]: I0930 05:31:14.653422 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.665350 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc" event={"ID":"aa301454-b9a6-4bed-acd1-f2cf109b5259","Type":"ContainerStarted","Data":"866c59cef97361f34d4a8c2eb1fa79e31e788aea19eb2dae373d75545b6992e5"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.680681 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" podStartSLOduration=123.680664088 podStartE2EDuration="2m3.680664088s" podCreationTimestamp="2025-09-30 05:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.680207793 +0000 UTC m=+145.007328318" watchObservedRunningTime="2025-09-30 05:31:14.680664088 +0000 UTC m=+145.007784623" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.681158 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" event={"ID":"00da0261-ae3c-40c6-a3a3-b7f8aaf3aff7","Type":"ContainerStarted","Data":"5dcb46572d6a99443b2a96a81074dc8603d2fd85df439559674e8d4eb46a2f82"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.686925 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" event={"ID":"fa42e960-679c-45f3-ae65-5ab2ebbc81e5","Type":"ContainerStarted","Data":"9ae7e66d02f715c61d9d73949c84d5098473a5d275d366356cb768a7a8dd0353"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.686975 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" event={"ID":"fa42e960-679c-45f3-ae65-5ab2ebbc81e5","Type":"ContainerStarted","Data":"4964b56491217ed5c97ed6b942a91116d252c27005bae64985fd3848c97f1174"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.687749 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.689203 4956 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tzwdp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.689230 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" podUID="fa42e960-679c-45f3-ae65-5ab2ebbc81e5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.690427 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-ztg8w" podStartSLOduration=124.690409372 podStartE2EDuration="2m4.690409372s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.643350744 +0000 UTC m=+144.970471269" watchObservedRunningTime="2025-09-30 05:31:14.690409372 +0000 UTC m=+145.017529897" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.718141 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" 
event={"ID":"2ad2e251-9549-4f18-8467-8dc98924bc23","Type":"ContainerStarted","Data":"d32cb6d17929cfe29c18b8fd18039796041ef54d70233b2032e12ef39f78e80f"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.718185 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.760234 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:14 crc kubenswrapper[4956]: E0930 05:31:14.761560 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:15.26153953 +0000 UTC m=+145.588660055 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.777914 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pclsc" podStartSLOduration=124.777889008 podStartE2EDuration="2m4.777889008s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.77445114 +0000 UTC m=+145.101571655" watchObservedRunningTime="2025-09-30 05:31:14.777889008 +0000 UTC m=+145.105009533" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.779399 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xg6dj" podStartSLOduration=124.779369499 podStartE2EDuration="2m4.779369499s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.75804289 +0000 UTC m=+145.085163415" watchObservedRunningTime="2025-09-30 05:31:14.779369499 +0000 UTC m=+145.106490044" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.808354 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" podStartSLOduration=74.808331738 podStartE2EDuration="1m14.808331738s" podCreationTimestamp="2025-09-30 05:30:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.808298556 +0000 UTC m=+145.135419081" watchObservedRunningTime="2025-09-30 05:31:14.808331738 +0000 UTC m=+145.135452253" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.839281 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" event={"ID":"56a1ba28-4b12-49ea-a04d-a9ab766a187f","Type":"ContainerStarted","Data":"2c49794a03f0ae584f25045a8b31e7f2f301bc695859229d01d6f4ee2edf95f2"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.839322 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" event={"ID":"56a1ba28-4b12-49ea-a04d-a9ab766a187f","Type":"ContainerStarted","Data":"eb9b66e8288964e63eb2616a32b3f01415dd675487400a272840889a07e5f8d8"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.849469 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6hhrb" podStartSLOduration=123.849454501 podStartE2EDuration="2m3.849454501s" podCreationTimestamp="2025-09-30 05:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.848330973 +0000 UTC m=+145.175451498" watchObservedRunningTime="2025-09-30 05:31:14.849454501 +0000 UTC m=+145.176575026" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.857506 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.859773 4956 patch_prober.go:28] interesting pod/router-default-5444994796-xg6dj container/router namespace/openshift-ingress: Startup probe status=failure output="Get 
\"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.859815 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg6dj" podUID="7aa8b675-10f2-4d8e-8e3a-79359f16d7bc" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.862581 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:14 crc kubenswrapper[4956]: E0930 05:31:14.862842 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:15.362831978 +0000 UTC m=+145.689952503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.887721 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" podStartSLOduration=124.887707338 podStartE2EDuration="2m4.887707338s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.886478205 +0000 UTC m=+145.213598720" watchObservedRunningTime="2025-09-30 05:31:14.887707338 +0000 UTC m=+145.214827863" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.896575 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" event={"ID":"9ee01037-07c8-471e-860f-801980e351f8","Type":"ContainerStarted","Data":"002400c5c4976a46772f09b0f44b4ffea7d412dc64f0d04f616b9e5b63a8374a"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.908660 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-v25lf" event={"ID":"91203589-4d13-4929-8bd1-28f8a40e2b44","Type":"ContainerStarted","Data":"59f444e7b6707c13175d0e984ce65c617a42b24d4022b95f7ddbd56ca9bb333c"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.909449 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-v25lf" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.911587 4956 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-v25lf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.911647 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v25lf" podUID="91203589-4d13-4929-8bd1-28f8a40e2b44" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.920962 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" event={"ID":"ee7ae876-4694-4cca-92e9-80528dc98bb4","Type":"ContainerStarted","Data":"1246ba6df00a2d9dcedd4bd43f5b97b10601971da44e0933b4a9a7921c52fbe9"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.940237 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" podStartSLOduration=124.940193009 podStartE2EDuration="2m4.940193009s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.930937114 +0000 UTC m=+145.258057639" watchObservedRunningTime="2025-09-30 05:31:14.940193009 +0000 UTC m=+145.267313534" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.961659 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-v25lf" podStartSLOduration=124.961645982 podStartE2EDuration="2m4.961645982s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 05:31:14.956949851 +0000 UTC m=+145.284070376" watchObservedRunningTime="2025-09-30 05:31:14.961645982 +0000 UTC m=+145.288766507" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.963071 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:14 crc kubenswrapper[4956]: E0930 05:31:14.963174 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:15.463159964 +0000 UTC m=+145.790280489 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.963437 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:14 crc kubenswrapper[4956]: E0930 05:31:14.965269 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:15.465257736 +0000 UTC m=+145.792378261 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.967288 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" event={"ID":"7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6","Type":"ContainerStarted","Data":"5074a10d3fd442d17f5a7f9d69e434dcbb195d64ae5404217c38f4097925cac4"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.967328 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" event={"ID":"7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6","Type":"ContainerStarted","Data":"e20b73ccbc15ddae486727232d26ef73c6768a3c57241cf504a0dd0bb71f39ab"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.982775 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5v6z6" event={"ID":"3a4863df-527f-4038-be9f-286437603a44","Type":"ContainerStarted","Data":"cebd409cb38e5cb0e7e80fccff653c70235c619e4d823740747cb27d26d60ed0"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.982837 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5v6z6" 
event={"ID":"3a4863df-527f-4038-be9f-286437603a44","Type":"ContainerStarted","Data":"ffb203405ac34477e50e071c397a3892a4ab0bcf83a29894b9da1c7237d63281"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.996624 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" event={"ID":"983d58af-8688-438e-8f8c-c658d1a9bdac","Type":"ContainerStarted","Data":"478a98f09a33b33f202fcc3468eb6677ed04caf907f9edb5d277a604f3ddfbaf"} Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.997499 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.998337 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" podStartSLOduration=124.998321694 podStartE2EDuration="2m4.998321694s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:14.996559914 +0000 UTC m=+145.323680439" watchObservedRunningTime="2025-09-30 05:31:14.998321694 +0000 UTC m=+145.325442219" Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.998641 4956 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-6mbsg container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Sep 30 05:31:14 crc kubenswrapper[4956]: I0930 05:31:14.998690 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" podUID="983d58af-8688-438e-8f8c-c658d1a9bdac" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": 
dial tcp 10.217.0.29:8443: connect: connection refused" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.001543 4956 generic.go:334] "Generic (PLEG): container finished" podID="d43f9c60-0360-4486-9c31-ecebf460d114" containerID="fb24f5ea0517442efbe0623f3b7a1d1440b0593d808a0b31767da6d96c7f4a98" exitCode=0 Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.001621 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" event={"ID":"d43f9c60-0360-4486-9c31-ecebf460d114","Type":"ContainerDied","Data":"fb24f5ea0517442efbe0623f3b7a1d1440b0593d808a0b31767da6d96c7f4a98"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.006836 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" event={"ID":"fab6b2a0-7e69-4446-88fa-094cf0ebcf12","Type":"ContainerStarted","Data":"0fcc1f30d633ff5530553fef217f8d847c7d9b49d3fa6d93d8923b35bd2cf406"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.018920 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8658c" event={"ID":"1454fd6b-7fe7-41bf-a8d0-0df3ca5f9d03","Type":"ContainerStarted","Data":"f4b95f751489c2227981b7edb1354b2acf6c289f4a914a77fecfa6cc2331d6e3"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.023758 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" event={"ID":"2774c9ff-8b7d-4937-a6bb-86746bd4829f","Type":"ContainerStarted","Data":"716fb7bf6306909fac1420f48e6a1e24472297d5d96970d3eeb5f123136a819f"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.023794 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" 
event={"ID":"2774c9ff-8b7d-4937-a6bb-86746bd4829f","Type":"ContainerStarted","Data":"420c0522134a709321df5284a41dfdabc76f7a66a3b03304e828ded3e4ff070e"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.028406 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" event={"ID":"c0770b87-e3c3-49d5-a290-3a3cd64367a7","Type":"ContainerStarted","Data":"b2df3abb7aaf9f109c262fec92ee36c0634658cb7128c520f4cb34322264d5f9"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.028443 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" event={"ID":"c0770b87-e3c3-49d5-a290-3a3cd64367a7","Type":"ContainerStarted","Data":"2822581cc3a57673d02e30908ea2abdd14798c294f12866d3c3600bc5441dd42"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.044803 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l87pb" podStartSLOduration=125.04478345 podStartE2EDuration="2m5.04478345s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.040060029 +0000 UTC m=+145.367180564" watchObservedRunningTime="2025-09-30 05:31:15.04478345 +0000 UTC m=+145.371903975" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.053981 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w22x5" event={"ID":"60e6248e-c328-4964-a551-34f6ab016589","Type":"ContainerStarted","Data":"c01a270828e6432955a012e57b402ff7f24281cb840a3dcc83fe28daed02ebb7"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.068607 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:15 crc kubenswrapper[4956]: E0930 05:31:15.070080 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:15.570058853 +0000 UTC m=+145.897179368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.061207 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" event={"ID":"36e17eaf-9884-4aa0-bd1e-0768c2e592f9","Type":"ContainerStarted","Data":"cfb04ed840387a47c94436cf63cfb59453adc77500c1302f60d7ee4053fa3473"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.083654 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" podStartSLOduration=124.083637567 podStartE2EDuration="2m4.083637567s" podCreationTimestamp="2025-09-30 05:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.082949593 +0000 UTC m=+145.410070128" watchObservedRunningTime="2025-09-30 05:31:15.083637567 +0000 UTC m=+145.410758092" Sep 30 
05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.102063 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l45c8" event={"ID":"5027231b-70a0-4243-aa58-2e2f14dbba62","Type":"ContainerStarted","Data":"95eefb3eccfff69c32f7d5a8f5e9b6eac7eb39cfb63d93aa1e282ca125066f16"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.124015 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-zhkrt" podStartSLOduration=125.123989425 podStartE2EDuration="2m5.123989425s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.123809978 +0000 UTC m=+145.450930503" watchObservedRunningTime="2025-09-30 05:31:15.123989425 +0000 UTC m=+145.451109950" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.149727 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ztg8w" event={"ID":"ebad915a-95b0-4596-8cbb-3d4acc6eb32a","Type":"ContainerStarted","Data":"b3b5f33fa58e55848dd341d6d416e01ba85b360d5806ca74773d1a519ba2e513"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.150167 4956 patch_prober.go:28] interesting pod/console-operator-58897d9998-ztg8w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.150200 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ztg8w" podUID="ebad915a-95b0-4596-8cbb-3d4acc6eb32a" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Sep 30 05:31:15 
crc kubenswrapper[4956]: I0930 05:31:15.154197 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" event={"ID":"9ec8ec3c-f0d3-41b1-a311-2eca015cd63a","Type":"ContainerStarted","Data":"736b15f956185aec47b732393f1267dd9f62c858cb80e398a6971acf5079191a"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.157255 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" event={"ID":"dfbe88f3-0f36-4ac5-9d56-f80d07af868a","Type":"ContainerStarted","Data":"92ad97a53557d4b866242a7eafcf2617d3efecac460eb86ca65a2019e96e9b1c"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.157296 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" event={"ID":"dfbe88f3-0f36-4ac5-9d56-f80d07af868a","Type":"ContainerStarted","Data":"18f0b37b0794c3be39d4bfbf7f4ce90abee5b9d8e55a8eef26383ee842203223"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.158099 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.159211 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" event={"ID":"25b00db7-971e-4eb8-8303-e6020a9ebc3c","Type":"ContainerStarted","Data":"a5ba4fd39586cf3e75364bb44b15cc70be16bb61c1c8182ef56cd899619c9750"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.160364 4956 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-j7mgz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.160401 
4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" podUID="dfbe88f3-0f36-4ac5-9d56-f80d07af868a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.161340 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-t5shm" event={"ID":"bcb2fabd-b2e0-41f3-8677-e55602876ca9","Type":"ContainerStarted","Data":"5e26a1d5e053c932962c383fd749ba72dc491a3cbe1b775a684439926d5716bb"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.165426 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" event={"ID":"55234365-65bf-4faa-9f74-dd50194f062e","Type":"ContainerStarted","Data":"f1ff53789daa1ea02aa8ac5516df21d122b8215d67e181c3507f2ab2ee19c039"} Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.166133 4956 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-x9c8r container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.166161 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" podUID="39bae241-73b1-4078-9861-309c762b38b5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.170230 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:15 crc kubenswrapper[4956]: E0930 05:31:15.171439 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:15.671427444 +0000 UTC m=+145.998547969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.171565 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-42782" podStartSLOduration=125.171548008 podStartE2EDuration="2m5.171548008s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.168917609 +0000 UTC m=+145.496038144" watchObservedRunningTime="2025-09-30 05:31:15.171548008 +0000 UTC m=+145.498668543" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.194313 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.208816 4956 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" podStartSLOduration=125.20880046 podStartE2EDuration="2m5.20880046s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.206460801 +0000 UTC m=+145.533581316" watchObservedRunningTime="2025-09-30 05:31:15.20880046 +0000 UTC m=+145.535920985" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.247079 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-wjwhf" podStartSLOduration=125.247062296 podStartE2EDuration="2m5.247062296s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.244982736 +0000 UTC m=+145.572103261" watchObservedRunningTime="2025-09-30 05:31:15.247062296 +0000 UTC m=+145.574182821" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.276566 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:15 crc kubenswrapper[4956]: E0930 05:31:15.277753 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:15.777736684 +0000 UTC m=+146.104857209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.314946 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-l45c8" podStartSLOduration=6.314922183 podStartE2EDuration="6.314922183s" podCreationTimestamp="2025-09-30 05:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.299748576 +0000 UTC m=+145.626869101" watchObservedRunningTime="2025-09-30 05:31:15.314922183 +0000 UTC m=+145.642042728" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.378824 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:15 crc kubenswrapper[4956]: E0930 05:31:15.379146 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:15.879134525 +0000 UTC m=+146.206255040 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.401389 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5v6z6" podStartSLOduration=125.401368765 podStartE2EDuration="2m5.401368765s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.399928146 +0000 UTC m=+145.727048671" watchObservedRunningTime="2025-09-30 05:31:15.401368765 +0000 UTC m=+145.728489280" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.401658 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-44xzl" podStartSLOduration=124.401651244 podStartE2EDuration="2m4.401651244s" podCreationTimestamp="2025-09-30 05:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.370716349 +0000 UTC m=+145.697836874" watchObservedRunningTime="2025-09-30 05:31:15.401651244 +0000 UTC m=+145.728771769" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.479737 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:15 crc kubenswrapper[4956]: E0930 05:31:15.480034 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:15.9800097 +0000 UTC m=+146.307130225 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.480410 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:15 crc kubenswrapper[4956]: E0930 05:31:15.480743 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:15.980727835 +0000 UTC m=+146.307848370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.491376 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" podStartSLOduration=125.491357787 podStartE2EDuration="2m5.491357787s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.490982164 +0000 UTC m=+145.818102699" watchObservedRunningTime="2025-09-30 05:31:15.491357787 +0000 UTC m=+145.818478312" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.540802 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8658c" podStartSLOduration=6.540784425 podStartE2EDuration="6.540784425s" podCreationTimestamp="2025-09-30 05:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.536386645 +0000 UTC m=+145.863507170" watchObservedRunningTime="2025-09-30 05:31:15.540784425 +0000 UTC m=+145.867904950" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.581574 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:15 crc kubenswrapper[4956]: E0930 05:31:15.581855 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:16.081833967 +0000 UTC m=+146.408954492 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.651809 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" podStartSLOduration=125.651792445 podStartE2EDuration="2m5.651792445s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.651420402 +0000 UTC m=+145.978540927" watchObservedRunningTime="2025-09-30 05:31:15.651792445 +0000 UTC m=+145.978912970" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.675621 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" podStartSLOduration=125.675604538 podStartE2EDuration="2m5.675604538s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.673870849 +0000 UTC m=+146.000991374" watchObservedRunningTime="2025-09-30 05:31:15.675604538 +0000 UTC m=+146.002725053" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.683095 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:15 crc kubenswrapper[4956]: E0930 05:31:15.683443 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:16.183426975 +0000 UTC m=+146.510547500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.698480 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" podStartSLOduration=125.698463949 podStartE2EDuration="2m5.698463949s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.695726015 +0000 UTC m=+146.022846540" 
watchObservedRunningTime="2025-09-30 05:31:15.698463949 +0000 UTC m=+146.025584484" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.717287 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5j47n" podStartSLOduration=125.717273471 podStartE2EDuration="2m5.717273471s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.715148818 +0000 UTC m=+146.042269343" watchObservedRunningTime="2025-09-30 05:31:15.717273471 +0000 UTC m=+146.044393996" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.755221 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-t5shm" podStartSLOduration=125.755206376 podStartE2EDuration="2m5.755206376s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.754541873 +0000 UTC m=+146.081662398" watchObservedRunningTime="2025-09-30 05:31:15.755206376 +0000 UTC m=+146.082326901" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.802026 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:15 crc kubenswrapper[4956]: E0930 05:31:15.802645 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 05:31:16.302627904 +0000 UTC m=+146.629748429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.802742 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zhmb6" podStartSLOduration=125.802657266 podStartE2EDuration="2m5.802657266s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.801534707 +0000 UTC m=+146.128655232" watchObservedRunningTime="2025-09-30 05:31:15.802657266 +0000 UTC m=+146.129777791" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.833066 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" podStartSLOduration=125.833051953 podStartE2EDuration="2m5.833051953s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.832516236 +0000 UTC m=+146.159636761" watchObservedRunningTime="2025-09-30 05:31:15.833051953 +0000 UTC m=+146.160172478" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.865356 4956 patch_prober.go:28] interesting pod/router-default-5444994796-xg6dj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 05:31:15 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Sep 30 05:31:15 crc kubenswrapper[4956]: [+]process-running ok Sep 30 05:31:15 crc kubenswrapper[4956]: healthz check failed Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.865745 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg6dj" podUID="7aa8b675-10f2-4d8e-8e3a-79359f16d7bc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 05:31:15 crc kubenswrapper[4956]: I0930 05:31:15.904836 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:15 crc kubenswrapper[4956]: E0930 05:31:15.905362 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:16.405350422 +0000 UTC m=+146.732470937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.006639 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.006832 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:16.506807116 +0000 UTC m=+146.833927641 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.006876 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.007283 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:16.507266881 +0000 UTC m=+146.834387406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.108519 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.108712 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:16.608688874 +0000 UTC m=+146.935809399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.108866 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.109163 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:16.60915197 +0000 UTC m=+146.936272495 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.170940 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mrbck" event={"ID":"90e0bca8-7e63-438d-95ca-7b855c885655","Type":"ContainerStarted","Data":"5491eea9a90958c63031129801bc627d438ca655b2fca93f55cb228b41496183"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.172143 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-69hd4" event={"ID":"8fb391e0-dd9a-447f-af99-8ec70c6221bd","Type":"ContainerStarted","Data":"0cc68d4ce98e18897ae94910f51aba0812109cec22c74f932d47cd5c200c690b"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.173498 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt" event={"ID":"1e8969ec-fae5-48f6-afe3-f230deb5f802","Type":"ContainerStarted","Data":"e3a0476242aae886f368df129aa68eeb4075e54021a75ee1ea22a8393e97a4a7"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.175511 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" event={"ID":"4608fdfb-f63a-4992-889e-3bbf043257b6","Type":"ContainerStarted","Data":"e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.175728 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:16 
crc kubenswrapper[4956]: I0930 05:31:16.177325 4956 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4fn5g container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.177376 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" podUID="4608fdfb-f63a-4992-889e-3bbf043257b6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.177879 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gbs8" event={"ID":"25b00db7-971e-4eb8-8303-e6020a9ebc3c","Type":"ContainerStarted","Data":"30159ae76f82df409df6f3bd90b5a61972569678d57e26f027c71c662f608841"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.179983 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" event={"ID":"d43f9c60-0360-4486-9c31-ecebf460d114","Type":"ContainerStarted","Data":"77be832e852ad266778eed1d0c8c6b28395411f76d2b24d5fd8b0f9dea7ca16f"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.181478 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-t5shm" event={"ID":"bcb2fabd-b2e0-41f3-8677-e55602876ca9","Type":"ContainerStarted","Data":"e06beadc041c1d77e8e5be278bbf3a78a59ec41d6d0730f82feadd931282856d"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.182493 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l45c8" 
event={"ID":"5027231b-70a0-4243-aa58-2e2f14dbba62","Type":"ContainerStarted","Data":"b25a6cbc1c9e7ec626b01368e222faff69db4aabc4da58bc2f555315cd0823f0"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.184336 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w22x5" event={"ID":"60e6248e-c328-4964-a551-34f6ab016589","Type":"ContainerStarted","Data":"1d135b139a841ec212e8830a563f2eef1d7b2ffc8bf0e1ef4516c6d5827f8568"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.184377 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w22x5" event={"ID":"60e6248e-c328-4964-a551-34f6ab016589","Type":"ContainerStarted","Data":"93c257f9a67201bc8ec4817f66dffc13798e5828573dd2727cdddbf1f5761f5a"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.184472 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.187603 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-v25lf" event={"ID":"91203589-4d13-4929-8bd1-28f8a40e2b44","Type":"ContainerStarted","Data":"ea809f42371fea2be6756bc1e7e8101de24bbb7a8758b670741c01627878c3d3"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.188211 4956 patch_prober.go:28] interesting pod/downloads-7954f5f757-v25lf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.188257 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v25lf" podUID="91203589-4d13-4929-8bd1-28f8a40e2b44" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Sep 30 05:31:16 crc kubenswrapper[4956]: 
I0930 05:31:16.189313 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8rd8z" event={"ID":"56a1ba28-4b12-49ea-a04d-a9ab766a187f","Type":"ContainerStarted","Data":"e8de6bbb8101b06ad33b3be3c8ef392377de0846e5e314769856fb198a2d6052"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.191248 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rmc7z" event={"ID":"7f498cc9-9022-4bb8-a1ab-fda9a5cd39f6","Type":"ContainerStarted","Data":"f048da58b97b8c5093bb385a7faf7a821c64309be74e82613e50cb2ad00b0aee"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.193249 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5v6z6" event={"ID":"3a4863df-527f-4038-be9f-286437603a44","Type":"ContainerStarted","Data":"34ba6e2bc5a5c4a7287247fb2d361b33fc18bd7ea1dadc2e8f306a8f92f2a409"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.195201 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sgr97" event={"ID":"55234365-65bf-4faa-9f74-dd50194f062e","Type":"ContainerStarted","Data":"605272edb853352fe72197cc97c4a4310c4497bfeefe397fc7dcb47f04c848e5"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.196430 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9hbc6" event={"ID":"fab6b2a0-7e69-4446-88fa-094cf0ebcf12","Type":"ContainerStarted","Data":"1ecd941e092dc60262b8df59bd6f97a8968d83a706b970fe2966c28b9ddf954c"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.197923 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mwj5h" 
event={"ID":"15065912-8d28-47dd-adaf-f7aedc754526","Type":"ContainerStarted","Data":"e8c8c621da2a97d635756073942e41233ac77bafad2256fbd9a47c2cda828d6a"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.199457 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf" event={"ID":"e1b8e355-5b80-4df5-9742-ccf35b26d474","Type":"ContainerStarted","Data":"f27fe3148bf401365eb3fd2de85de1253ff65b4be83e10386542dcc66d502394"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.199566 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf" event={"ID":"e1b8e355-5b80-4df5-9742-ccf35b26d474","Type":"ContainerStarted","Data":"c1844a6c8c69b9946ae9dc51f09eed61dcfdc14b00ee24e0b0dc95d57f4eb94b"} Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.200337 4956 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rvfx8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.200382 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" podUID="9bdc12c7-725c-4496-8e0e-e4ec3d911cca" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.212689 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j7mgz" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.214047 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.215731 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:16.715717469 +0000 UTC m=+147.042837994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.220460 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.223302 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:16.723288417 +0000 UTC m=+147.050408942 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.235335 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.241285 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-ztg8w" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.265710 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6mbsg" podStartSLOduration=126.265688934 podStartE2EDuration="2m6.265688934s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:15.866158853 +0000 UTC m=+146.193279388" watchObservedRunningTime="2025-09-30 05:31:16.265688934 +0000 UTC m=+146.592809459" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.267057 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-mrbck" podStartSLOduration=126.267048481 podStartE2EDuration="2m6.267048481s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:16.2652588 +0000 UTC m=+146.592379335" 
watchObservedRunningTime="2025-09-30 05:31:16.267048481 +0000 UTC m=+146.594169026" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.348192 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.348854 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:16.848840344 +0000 UTC m=+147.175960869 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.451455 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.451856 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:16.95183919 +0000 UTC m=+147.278959715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.499548 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w22x5" podStartSLOduration=7.499534019 podStartE2EDuration="7.499534019s" podCreationTimestamp="2025-09-30 05:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:16.495349306 +0000 UTC m=+146.822469831" watchObservedRunningTime="2025-09-30 05:31:16.499534019 +0000 UTC m=+146.826654544" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.551980 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.552366 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.052349752 +0000 UTC m=+147.379470277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.564679 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" podStartSLOduration=126.564663052 podStartE2EDuration="2m6.564663052s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:16.557218929 +0000 UTC m=+146.884339454" watchObservedRunningTime="2025-09-30 05:31:16.564663052 +0000 UTC m=+146.891783577" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.596891 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mwj5h" podStartSLOduration=126.596874872 podStartE2EDuration="2m6.596874872s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:16.596706016 +0000 UTC m=+146.923826541" watchObservedRunningTime="2025-09-30 05:31:16.596874872 +0000 UTC m=+146.923995397" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.653434 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: 
\"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.653729 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.153715923 +0000 UTC m=+147.480836448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.670686 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.670944 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.693134 4956 patch_prober.go:28] interesting pod/apiserver-76f77b778f-mrbck container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.693179 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-mrbck" podUID="90e0bca8-7e63-438d-95ca-7b855c885655" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: 
connection refused" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.754620 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.754799 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.254773063 +0000 UTC m=+147.581893588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.754917 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.755260 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 05:31:17.25524961 +0000 UTC m=+147.582370125 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.770624 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" podStartSLOduration=126.770604013 podStartE2EDuration="2m6.770604013s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:16.768482422 +0000 UTC m=+147.095602947" watchObservedRunningTime="2025-09-30 05:31:16.770604013 +0000 UTC m=+147.097724538" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.771837 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s8rtf" podStartSLOduration=126.771829836 podStartE2EDuration="2m6.771829836s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:16.670383682 +0000 UTC m=+146.997504207" watchObservedRunningTime="2025-09-30 05:31:16.771829836 +0000 UTC m=+147.098950361" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.853815 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:16 crc 
kubenswrapper[4956]: I0930 05:31:16.854672 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.856305 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.856610 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.356592149 +0000 UTC m=+147.683712674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.860852 4956 patch_prober.go:28] interesting pod/router-default-5444994796-xg6dj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 05:31:16 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Sep 30 05:31:16 crc kubenswrapper[4956]: [+]process-running ok Sep 30 05:31:16 crc kubenswrapper[4956]: healthz check failed Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.860918 4956 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg6dj" podUID="7aa8b675-10f2-4d8e-8e3a-79359f16d7bc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 05:31:16 crc kubenswrapper[4956]: I0930 05:31:16.957697 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:16 crc kubenswrapper[4956]: E0930 05:31:16.958001 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.457990192 +0000 UTC m=+147.785110717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.036099 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s62b4" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.059281 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:17 crc kubenswrapper[4956]: E0930 05:31:17.059840 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.559824558 +0000 UTC m=+147.886945083 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.162487 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:17 crc kubenswrapper[4956]: E0930 05:31:17.162770 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.662759052 +0000 UTC m=+147.989879577 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.200546 4956 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tzwdp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.200834 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" podUID="fa42e960-679c-45f3-ae65-5ab2ebbc81e5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.221177 4956 generic.go:334] "Generic (PLEG): container finished" podID="140b2851-bf05-4ec3-87db-657aacefdbd4" containerID="a8158496e72dde154eee6af181c09d157c45adbc5671ed3ffee47dfc313fcd29" exitCode=0 Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.221232 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" event={"ID":"140b2851-bf05-4ec3-87db-657aacefdbd4","Type":"ContainerDied","Data":"a8158496e72dde154eee6af181c09d157c45adbc5671ed3ffee47dfc313fcd29"} Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.247668 4956 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-69hd4" event={"ID":"8fb391e0-dd9a-447f-af99-8ec70c6221bd","Type":"ContainerStarted","Data":"fd28192cf5c2450b04d5662d0039333912d69d1d34da857eeeddcf673a4ae569"} Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.248812 4956 patch_prober.go:28] interesting pod/downloads-7954f5f757-v25lf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.248851 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v25lf" podUID="91203589-4d13-4929-8bd1-28f8a40e2b44" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.259997 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.265771 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:17 crc kubenswrapper[4956]: E0930 05:31:17.267324 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.767308992 +0000 UTC m=+148.094429517 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.267398 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:17 crc kubenswrapper[4956]: E0930 05:31:17.267942 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.767935953 +0000 UTC m=+148.095056478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.361147 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tzwdp" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.369411 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:17 crc kubenswrapper[4956]: E0930 05:31:17.369902 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.869886664 +0000 UTC m=+148.197007189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.478918 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.478963 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.479020 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.479044 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.479061 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:31:17 crc kubenswrapper[4956]: E0930 05:31:17.479370 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:17.979354491 +0000 UTC m=+148.306475006 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.485397 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.491895 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.516896 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.541780 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.581790 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:17 crc kubenswrapper[4956]: E0930 05:31:17.582104 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:18.082089589 +0000 UTC m=+148.409210114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.654955 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mmr6c"] Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.656083 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.662607 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.664376 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.672921 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmr6c"] Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.681617 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.682862 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:17 crc kubenswrapper[4956]: E0930 05:31:17.683154 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:18.1831424 +0000 UTC m=+148.510262925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.715371 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.792639 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.792798 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs9gq\" (UniqueName: \"kubernetes.io/projected/cb546d14-6d5d-4b44-b501-070ea2251e4b-kube-api-access-fs9gq\") pod \"certified-operators-mmr6c\" (UID: \"cb546d14-6d5d-4b44-b501-070ea2251e4b\") " pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.792875 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb546d14-6d5d-4b44-b501-070ea2251e4b-catalog-content\") pod \"certified-operators-mmr6c\" (UID: \"cb546d14-6d5d-4b44-b501-070ea2251e4b\") " pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.792917 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb546d14-6d5d-4b44-b501-070ea2251e4b-utilities\") pod \"certified-operators-mmr6c\" (UID: \"cb546d14-6d5d-4b44-b501-070ea2251e4b\") " pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:17 crc kubenswrapper[4956]: E0930 05:31:17.793266 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:18.293247789 +0000 UTC m=+148.620368314 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.854995 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4nrh7"] Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.855862 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.859469 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.861879 4956 patch_prober.go:28] interesting pod/router-default-5444994796-xg6dj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 05:31:17 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Sep 30 05:31:17 crc kubenswrapper[4956]: [+]process-running ok Sep 30 05:31:17 crc kubenswrapper[4956]: healthz check failed Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.862103 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg6dj" podUID="7aa8b675-10f2-4d8e-8e3a-79359f16d7bc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.885990 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4nrh7"] Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.898577 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kzn2\" (UniqueName: \"kubernetes.io/projected/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-kube-api-access-5kzn2\") pod \"community-operators-4nrh7\" (UID: \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\") " pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.898629 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb546d14-6d5d-4b44-b501-070ea2251e4b-catalog-content\") pod \"certified-operators-mmr6c\" 
(UID: \"cb546d14-6d5d-4b44-b501-070ea2251e4b\") " pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.898662 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.898680 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb546d14-6d5d-4b44-b501-070ea2251e4b-utilities\") pod \"certified-operators-mmr6c\" (UID: \"cb546d14-6d5d-4b44-b501-070ea2251e4b\") " pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.898710 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs9gq\" (UniqueName: \"kubernetes.io/projected/cb546d14-6d5d-4b44-b501-070ea2251e4b-kube-api-access-fs9gq\") pod \"certified-operators-mmr6c\" (UID: \"cb546d14-6d5d-4b44-b501-070ea2251e4b\") " pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.898741 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-utilities\") pod \"community-operators-4nrh7\" (UID: \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\") " pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.898757 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-catalog-content\") pod \"community-operators-4nrh7\" (UID: \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\") " pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.899149 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb546d14-6d5d-4b44-b501-070ea2251e4b-catalog-content\") pod \"certified-operators-mmr6c\" (UID: \"cb546d14-6d5d-4b44-b501-070ea2251e4b\") " pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:17 crc kubenswrapper[4956]: E0930 05:31:17.899382 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:18.399371602 +0000 UTC m=+148.726492127 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.899733 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb546d14-6d5d-4b44-b501-070ea2251e4b-utilities\") pod \"certified-operators-mmr6c\" (UID: \"cb546d14-6d5d-4b44-b501-070ea2251e4b\") " pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.944194 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs9gq\" (UniqueName: 
\"kubernetes.io/projected/cb546d14-6d5d-4b44-b501-070ea2251e4b-kube-api-access-fs9gq\") pod \"certified-operators-mmr6c\" (UID: \"cb546d14-6d5d-4b44-b501-070ea2251e4b\") " pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.983654 4956 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.987359 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:17 crc kubenswrapper[4956]: I0930 05:31:17.999929 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.000131 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-catalog-content\") pod \"community-operators-4nrh7\" (UID: \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\") " pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.000163 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kzn2\" (UniqueName: \"kubernetes.io/projected/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-kube-api-access-5kzn2\") pod \"community-operators-4nrh7\" (UID: \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\") " pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.000267 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-utilities\") pod \"community-operators-4nrh7\" (UID: \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\") " pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.000623 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-utilities\") pod \"community-operators-4nrh7\" (UID: \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\") " pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:18 crc kubenswrapper[4956]: E0930 05:31:18.000944 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:18.50092881 +0000 UTC m=+148.828049345 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.001288 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-catalog-content\") pod \"community-operators-4nrh7\" (UID: \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\") " pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.045892 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kzn2\" (UniqueName: \"kubernetes.io/projected/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-kube-api-access-5kzn2\") pod \"community-operators-4nrh7\" (UID: \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\") " pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.060132 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rk8m8"] Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.069604 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.077252 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.077296 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.095497 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rk8m8"] Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.109223 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b351e8a-d6e9-426c-a3fc-b486964f28c7-catalog-content\") pod \"certified-operators-rk8m8\" (UID: \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\") " pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.109280 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.109306 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kkpp\" (UniqueName: \"kubernetes.io/projected/7b351e8a-d6e9-426c-a3fc-b486964f28c7-kube-api-access-7kkpp\") pod \"certified-operators-rk8m8\" (UID: \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\") " pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.109330 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b351e8a-d6e9-426c-a3fc-b486964f28c7-utilities\") pod \"certified-operators-rk8m8\" (UID: \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\") " pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:18 crc kubenswrapper[4956]: E0930 05:31:18.109600 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:18.609586889 +0000 UTC m=+148.936707414 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.183392 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.212630 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.212873 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kkpp\" (UniqueName: \"kubernetes.io/projected/7b351e8a-d6e9-426c-a3fc-b486964f28c7-kube-api-access-7kkpp\") pod \"certified-operators-rk8m8\" (UID: \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\") " pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.212922 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b351e8a-d6e9-426c-a3fc-b486964f28c7-utilities\") pod \"certified-operators-rk8m8\" (UID: \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\") " pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.212997 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b351e8a-d6e9-426c-a3fc-b486964f28c7-catalog-content\") pod \"certified-operators-rk8m8\" (UID: \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\") " pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.213564 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b351e8a-d6e9-426c-a3fc-b486964f28c7-catalog-content\") pod \"certified-operators-rk8m8\" (UID: \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\") 
" pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:18 crc kubenswrapper[4956]: E0930 05:31:18.214061 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:18.714041365 +0000 UTC m=+149.041161900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.214580 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b351e8a-d6e9-426c-a3fc-b486964f28c7-utilities\") pod \"certified-operators-rk8m8\" (UID: \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\") " pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.244389 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.253940 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.277249 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bvx6l"] Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.278233 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.291524 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-69hd4" event={"ID":"8fb391e0-dd9a-447f-af99-8ec70c6221bd","Type":"ContainerStarted","Data":"64c1a65ad27cb866e25541420515be0bceb1d45a0d86e954d4c797134a488c56"} Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.291573 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-69hd4" event={"ID":"8fb391e0-dd9a-447f-af99-8ec70c6221bd","Type":"ContainerStarted","Data":"3af1bfe1f5e3213bf5ba21b170e11870d0271ec374ce4b3f78d915b0aaecb09e"} Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.292878 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kkpp\" (UniqueName: \"kubernetes.io/projected/7b351e8a-d6e9-426c-a3fc-b486964f28c7-kube-api-access-7kkpp\") pod \"certified-operators-rk8m8\" (UID: \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\") " pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.310075 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-chdm2" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.311055 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bvx6l"] Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.315636 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:18 crc kubenswrapper[4956]: E0930 
05:31:18.316188 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 05:31:18.816131741 +0000 UTC m=+149.143252266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.416633 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.417032 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gs8h\" (UniqueName: \"kubernetes.io/projected/57114151-313a-4d50-bf79-8d92e5c50208-kube-api-access-2gs8h\") pod \"community-operators-bvx6l\" (UID: \"57114151-313a-4d50-bf79-8d92e5c50208\") " pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.418423 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57114151-313a-4d50-bf79-8d92e5c50208-catalog-content\") pod \"community-operators-bvx6l\" (UID: \"57114151-313a-4d50-bf79-8d92e5c50208\") " 
pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.421425 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57114151-313a-4d50-bf79-8d92e5c50208-utilities\") pod \"community-operators-bvx6l\" (UID: \"57114151-313a-4d50-bf79-8d92e5c50208\") " pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:18 crc kubenswrapper[4956]: E0930 05:31:18.423177 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 05:31:18.923154905 +0000 UTC m=+149.250275430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.457779 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.525231 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.525448 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gs8h\" (UniqueName: \"kubernetes.io/projected/57114151-313a-4d50-bf79-8d92e5c50208-kube-api-access-2gs8h\") pod \"community-operators-bvx6l\" (UID: \"57114151-313a-4d50-bf79-8d92e5c50208\") " pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.525557 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57114151-313a-4d50-bf79-8d92e5c50208-catalog-content\") pod \"community-operators-bvx6l\" (UID: \"57114151-313a-4d50-bf79-8d92e5c50208\") " pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.525654 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57114151-313a-4d50-bf79-8d92e5c50208-utilities\") pod \"community-operators-bvx6l\" (UID: \"57114151-313a-4d50-bf79-8d92e5c50208\") " pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:18 crc kubenswrapper[4956]: E0930 05:31:18.525929 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 05:31:19.025909383 +0000 UTC m=+149.353029918 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5kh24" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.526254 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57114151-313a-4d50-bf79-8d92e5c50208-utilities\") pod \"community-operators-bvx6l\" (UID: \"57114151-313a-4d50-bf79-8d92e5c50208\") " pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.526667 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57114151-313a-4d50-bf79-8d92e5c50208-catalog-content\") pod \"community-operators-bvx6l\" (UID: \"57114151-313a-4d50-bf79-8d92e5c50208\") " pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.556315 4956 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T05:31:17.98367999Z","Handler":null,"Name":""} Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.561389 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-69hd4" podStartSLOduration=9.561370654 podStartE2EDuration="9.561370654s" podCreationTimestamp="2025-09-30 05:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:18.515502528 +0000 UTC m=+148.842623073" watchObservedRunningTime="2025-09-30 05:31:18.561370654 +0000 UTC m=+148.888491179" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.584488 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gs8h\" (UniqueName: \"kubernetes.io/projected/57114151-313a-4d50-bf79-8d92e5c50208-kube-api-access-2gs8h\") pod \"community-operators-bvx6l\" (UID: \"57114151-313a-4d50-bf79-8d92e5c50208\") " pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.593451 4956 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.593488 4956 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.626889 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.631411 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.637816 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.728128 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.747368 4956 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.747410 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:18 crc kubenswrapper[4956]: W0930 05:31:18.769106 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-b02951ffee80b0abcc98dc1d53dbc472598d6a8f06d3e2f7191a321639ddff3a WatchSource:0}: Error finding container b02951ffee80b0abcc98dc1d53dbc472598d6a8f06d3e2f7191a321639ddff3a: Status 404 returned error can't find the container with id 
b02951ffee80b0abcc98dc1d53dbc472598d6a8f06d3e2f7191a321639ddff3a Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.859333 4956 patch_prober.go:28] interesting pod/router-default-5444994796-xg6dj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 05:31:18 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Sep 30 05:31:18 crc kubenswrapper[4956]: [+]process-running ok Sep 30 05:31:18 crc kubenswrapper[4956]: healthz check failed Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.859382 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg6dj" podUID="7aa8b675-10f2-4d8e-8e3a-79359f16d7bc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.861835 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5kh24\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:18 crc kubenswrapper[4956]: W0930 05:31:18.889505 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-f177dec41ace097acdb6055a4d91ebd4544f76236e9ec7f3f732c8245f7f6785 WatchSource:0}: Error finding container f177dec41ace097acdb6055a4d91ebd4544f76236e9ec7f3f732c8245f7f6785: Status 404 returned error can't find the container with id f177dec41ace097acdb6055a4d91ebd4544f76236e9ec7f3f732c8245f7f6785 Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.921476 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.938797 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtlz5\" (UniqueName: \"kubernetes.io/projected/140b2851-bf05-4ec3-87db-657aacefdbd4-kube-api-access-jtlz5\") pod \"140b2851-bf05-4ec3-87db-657aacefdbd4\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.938856 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/140b2851-bf05-4ec3-87db-657aacefdbd4-config-volume\") pod \"140b2851-bf05-4ec3-87db-657aacefdbd4\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.938913 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/140b2851-bf05-4ec3-87db-657aacefdbd4-secret-volume\") pod \"140b2851-bf05-4ec3-87db-657aacefdbd4\" (UID: \"140b2851-bf05-4ec3-87db-657aacefdbd4\") " Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.940127 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/140b2851-bf05-4ec3-87db-657aacefdbd4-config-volume" (OuterVolumeSpecName: "config-volume") pod "140b2851-bf05-4ec3-87db-657aacefdbd4" (UID: "140b2851-bf05-4ec3-87db-657aacefdbd4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.947893 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/140b2851-bf05-4ec3-87db-657aacefdbd4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "140b2851-bf05-4ec3-87db-657aacefdbd4" (UID: "140b2851-bf05-4ec3-87db-657aacefdbd4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.957671 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/140b2851-bf05-4ec3-87db-657aacefdbd4-kube-api-access-jtlz5" (OuterVolumeSpecName: "kube-api-access-jtlz5") pod "140b2851-bf05-4ec3-87db-657aacefdbd4" (UID: "140b2851-bf05-4ec3-87db-657aacefdbd4"). InnerVolumeSpecName "kube-api-access-jtlz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:31:18 crc kubenswrapper[4956]: I0930 05:31:18.983626 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmr6c"] Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.011719 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4nrh7"] Sep 30 05:31:19 crc kubenswrapper[4956]: W0930 05:31:19.014953 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb546d14_6d5d_4b44_b501_070ea2251e4b.slice/crio-7689d7801082f87063ffa49011bc986a8e8cae285c6a77722fe82544e51828bd WatchSource:0}: Error finding container 7689d7801082f87063ffa49011bc986a8e8cae285c6a77722fe82544e51828bd: Status 404 returned error can't find the container with id 7689d7801082f87063ffa49011bc986a8e8cae285c6a77722fe82544e51828bd Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.041557 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtlz5\" (UniqueName: \"kubernetes.io/projected/140b2851-bf05-4ec3-87db-657aacefdbd4-kube-api-access-jtlz5\") on node \"crc\" DevicePath \"\"" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.041599 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/140b2851-bf05-4ec3-87db-657aacefdbd4-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 
05:31:19.041613 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/140b2851-bf05-4ec3-87db-657aacefdbd4-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.071016 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.075341 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 05:31:19 crc kubenswrapper[4956]: E0930 05:31:19.075588 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140b2851-bf05-4ec3-87db-657aacefdbd4" containerName="collect-profiles" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.075607 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="140b2851-bf05-4ec3-87db-657aacefdbd4" containerName="collect-profiles" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.075700 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="140b2851-bf05-4ec3-87db-657aacefdbd4" containerName="collect-profiles" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.076084 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.078672 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.080788 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.086751 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.130171 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rk8m8"] Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.142868 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f6ab45a-e36e-4328-a0ff-1a62a322133e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f6ab45a-e36e-4328-a0ff-1a62a322133e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.143001 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f6ab45a-e36e-4328-a0ff-1a62a322133e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f6ab45a-e36e-4328-a0ff-1a62a322133e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.245646 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f6ab45a-e36e-4328-a0ff-1a62a322133e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f6ab45a-e36e-4328-a0ff-1a62a322133e\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.246092 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f6ab45a-e36e-4328-a0ff-1a62a322133e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f6ab45a-e36e-4328-a0ff-1a62a322133e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.246478 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f6ab45a-e36e-4328-a0ff-1a62a322133e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2f6ab45a-e36e-4328-a0ff-1a62a322133e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.276444 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f6ab45a-e36e-4328-a0ff-1a62a322133e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2f6ab45a-e36e-4328-a0ff-1a62a322133e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.309017 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nrh7" event={"ID":"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0","Type":"ContainerStarted","Data":"951dc5b9dd7e7d5b4f4b6c77ec782d773c3faee534eadddea1220860158fa57d"} Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.318880 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0fa613ae017fe5f276b5411e2b95522bbda296ab465dae41599264d55da3c74b"} Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.318924 4956 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"df826cc136fc677a9b104b9eba680a513d96c93b9c5349e5c95c78e78c43c35f"} Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.331278 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk8m8" event={"ID":"7b351e8a-d6e9-426c-a3fc-b486964f28c7","Type":"ContainerStarted","Data":"30ae1bd6981b764824ec8096a1589e444907bd9ab4e4f9821042c48a6d3207d9"} Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.336926 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.337198 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht" event={"ID":"140b2851-bf05-4ec3-87db-657aacefdbd4","Type":"ContainerDied","Data":"7d7487f28f9b8d155df2d31f952ae540a39467741c335ac6290b85127a6ae447"} Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.337247 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d7487f28f9b8d155df2d31f952ae540a39467741c335ac6290b85127a6ae447" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.338769 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmr6c" event={"ID":"cb546d14-6d5d-4b44-b501-070ea2251e4b","Type":"ContainerStarted","Data":"7689d7801082f87063ffa49011bc986a8e8cae285c6a77722fe82544e51828bd"} Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.340770 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f177dec41ace097acdb6055a4d91ebd4544f76236e9ec7f3f732c8245f7f6785"} Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.341263 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.342606 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4594e64da806a651209a5826c662e431ab0d066c6a3680a526a1820d95d006a8"} Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.342636 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b02951ffee80b0abcc98dc1d53dbc472598d6a8f06d3e2f7191a321639ddff3a"} Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.431581 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.452743 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bvx6l"] Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.581517 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5kh24"] Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.719252 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 05:31:19 crc kubenswrapper[4956]: W0930 05:31:19.725578 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2f6ab45a_e36e_4328_a0ff_1a62a322133e.slice/crio-62923367a4fc01101376cda4bcb957a8c22b4ae628e7e902d2108ea918d6764f WatchSource:0}: Error finding container 62923367a4fc01101376cda4bcb957a8c22b4ae628e7e902d2108ea918d6764f: Status 404 returned error can't find the container with id 62923367a4fc01101376cda4bcb957a8c22b4ae628e7e902d2108ea918d6764f Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.843627 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lmln2"] Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.844967 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.847264 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.858209 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-catalog-content\") pod \"redhat-marketplace-lmln2\" (UID: \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\") " pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.858326 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js7v6\" (UniqueName: \"kubernetes.io/projected/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-kube-api-access-js7v6\") pod \"redhat-marketplace-lmln2\" (UID: \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\") " pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.858385 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-utilities\") pod \"redhat-marketplace-lmln2\" (UID: \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\") " pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.860836 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmln2"] Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.864284 4956 patch_prober.go:28] interesting pod/router-default-5444994796-xg6dj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 
30 05:31:19 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Sep 30 05:31:19 crc kubenswrapper[4956]: [+]process-running ok Sep 30 05:31:19 crc kubenswrapper[4956]: healthz check failed Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.864353 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg6dj" podUID="7aa8b675-10f2-4d8e-8e3a-79359f16d7bc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.962790 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-catalog-content\") pod \"redhat-marketplace-lmln2\" (UID: \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\") " pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.962848 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js7v6\" (UniqueName: \"kubernetes.io/projected/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-kube-api-access-js7v6\") pod \"redhat-marketplace-lmln2\" (UID: \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\") " pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.962877 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-utilities\") pod \"redhat-marketplace-lmln2\" (UID: \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\") " pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.963355 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-utilities\") pod \"redhat-marketplace-lmln2\" (UID: 
\"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\") " pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.963386 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-catalog-content\") pod \"redhat-marketplace-lmln2\" (UID: \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\") " pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:19 crc kubenswrapper[4956]: I0930 05:31:19.981794 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js7v6\" (UniqueName: \"kubernetes.io/projected/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-kube-api-access-js7v6\") pod \"redhat-marketplace-lmln2\" (UID: \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\") " pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.160032 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.281210 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4qhbk"] Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.282530 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.293301 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qhbk"] Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.350974 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.363327 4956 generic.go:334] "Generic (PLEG): container finished" podID="cb546d14-6d5d-4b44-b501-070ea2251e4b" containerID="0740c5e8652d2e9f72efecd152b50ae0c4c1f0bc99531ffa83a7ee9f85ce8c6e" exitCode=0 Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.363394 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmr6c" event={"ID":"cb546d14-6d5d-4b44-b501-070ea2251e4b","Type":"ContainerDied","Data":"0740c5e8652d2e9f72efecd152b50ae0c4c1f0bc99531ffa83a7ee9f85ce8c6e"} Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.368081 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.369785 4956 generic.go:334] "Generic (PLEG): container finished" podID="57114151-313a-4d50-bf79-8d92e5c50208" containerID="ab7f4c4e6791fcb8454a2fd80da3004b24b2acee6ec02c67fdddc50bc9367005" exitCode=0 Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.370024 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvx6l" event={"ID":"57114151-313a-4d50-bf79-8d92e5c50208","Type":"ContainerDied","Data":"ab7f4c4e6791fcb8454a2fd80da3004b24b2acee6ec02c67fdddc50bc9367005"} Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.370059 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvx6l" 
event={"ID":"57114151-313a-4d50-bf79-8d92e5c50208","Type":"ContainerStarted","Data":"062b6795e822e480e62b62052b444fb5ab6a48f92e429ccc85c61b6d72b2aa55"} Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.388215 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mw2f\" (UniqueName: \"kubernetes.io/projected/fa47aa22-683e-413d-b004-054e1e331342-kube-api-access-4mw2f\") pod \"redhat-marketplace-4qhbk\" (UID: \"fa47aa22-683e-413d-b004-054e1e331342\") " pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.388244 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa47aa22-683e-413d-b004-054e1e331342-catalog-content\") pod \"redhat-marketplace-4qhbk\" (UID: \"fa47aa22-683e-413d-b004-054e1e331342\") " pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.388301 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa47aa22-683e-413d-b004-054e1e331342-utilities\") pod \"redhat-marketplace-4qhbk\" (UID: \"fa47aa22-683e-413d-b004-054e1e331342\") " pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.392236 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a4831c06836f45deadd3c9240d7535f660532b13a50521c18dc0813547d095f1"} Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.402619 4956 generic.go:334] "Generic (PLEG): container finished" podID="4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" containerID="a995d6987a46f32e05b32a1f19b702b7ade8afb18380ebf6227190a4e4b8af21" exitCode=0 Sep 30 05:31:20 
crc kubenswrapper[4956]: I0930 05:31:20.402735 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nrh7" event={"ID":"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0","Type":"ContainerDied","Data":"a995d6987a46f32e05b32a1f19b702b7ade8afb18380ebf6227190a4e4b8af21"} Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.413444 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmln2"] Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.414631 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2f6ab45a-e36e-4328-a0ff-1a62a322133e","Type":"ContainerStarted","Data":"a9c7733225e19d55b4f40addb661f3a00cfdcc0bdee35caa5b82f54e2c1f2e43"} Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.414654 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2f6ab45a-e36e-4328-a0ff-1a62a322133e","Type":"ContainerStarted","Data":"62923367a4fc01101376cda4bcb957a8c22b4ae628e7e902d2108ea918d6764f"} Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.423060 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" event={"ID":"98c27d5f-42a5-4c1b-b5f4-49dcef583537","Type":"ContainerStarted","Data":"78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617"} Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.423092 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" event={"ID":"98c27d5f-42a5-4c1b-b5f4-49dcef583537","Type":"ContainerStarted","Data":"a46d696672ca20a9f156bbc422bac2da96a228600d47bd4165a802152ffd9698"} Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.423204 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.453778 4956 generic.go:334] "Generic (PLEG): container finished" podID="7b351e8a-d6e9-426c-a3fc-b486964f28c7" containerID="7bf08645b6b589c613417ea8e174b48a5635bc3d927db226a2aa877dc298e75d" exitCode=0 Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.454550 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk8m8" event={"ID":"7b351e8a-d6e9-426c-a3fc-b486964f28c7","Type":"ContainerDied","Data":"7bf08645b6b589c613417ea8e174b48a5635bc3d927db226a2aa877dc298e75d"} Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.476606 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.476588134 podStartE2EDuration="1.476588134s" podCreationTimestamp="2025-09-30 05:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:20.468817758 +0000 UTC m=+150.795938323" watchObservedRunningTime="2025-09-30 05:31:20.476588134 +0000 UTC m=+150.803708659" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.490182 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mw2f\" (UniqueName: \"kubernetes.io/projected/fa47aa22-683e-413d-b004-054e1e331342-kube-api-access-4mw2f\") pod \"redhat-marketplace-4qhbk\" (UID: \"fa47aa22-683e-413d-b004-054e1e331342\") " pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.490232 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa47aa22-683e-413d-b004-054e1e331342-catalog-content\") pod \"redhat-marketplace-4qhbk\" (UID: \"fa47aa22-683e-413d-b004-054e1e331342\") " 
pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.490276 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa47aa22-683e-413d-b004-054e1e331342-utilities\") pod \"redhat-marketplace-4qhbk\" (UID: \"fa47aa22-683e-413d-b004-054e1e331342\") " pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.491302 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa47aa22-683e-413d-b004-054e1e331342-utilities\") pod \"redhat-marketplace-4qhbk\" (UID: \"fa47aa22-683e-413d-b004-054e1e331342\") " pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.491631 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa47aa22-683e-413d-b004-054e1e331342-catalog-content\") pod \"redhat-marketplace-4qhbk\" (UID: \"fa47aa22-683e-413d-b004-054e1e331342\") " pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.508635 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" podStartSLOduration=130.508619317 podStartE2EDuration="2m10.508619317s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:20.508007586 +0000 UTC m=+150.835128111" watchObservedRunningTime="2025-09-30 05:31:20.508619317 +0000 UTC m=+150.835739842" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.515523 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mw2f\" (UniqueName: 
\"kubernetes.io/projected/fa47aa22-683e-413d-b004-054e1e331342-kube-api-access-4mw2f\") pod \"redhat-marketplace-4qhbk\" (UID: \"fa47aa22-683e-413d-b004-054e1e331342\") " pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.601032 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.796704 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qhbk"] Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.840187 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gv9cd"] Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.841671 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.843675 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.854847 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gv9cd"] Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.864873 4956 patch_prober.go:28] interesting pod/router-default-5444994796-xg6dj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 05:31:20 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Sep 30 05:31:20 crc kubenswrapper[4956]: [+]process-running ok Sep 30 05:31:20 crc kubenswrapper[4956]: healthz check failed Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.864933 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg6dj" 
podUID="7aa8b675-10f2-4d8e-8e3a-79359f16d7bc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.958735 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.959647 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.962512 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.962842 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.975663 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.998795 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac16a38f-5e71-4eee-8189-9361cfd19b84-utilities\") pod \"redhat-operators-gv9cd\" (UID: \"ac16a38f-5e71-4eee-8189-9361cfd19b84\") " pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.998844 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr8k5\" (UniqueName: \"kubernetes.io/projected/ac16a38f-5e71-4eee-8189-9361cfd19b84-kube-api-access-cr8k5\") pod \"redhat-operators-gv9cd\" (UID: \"ac16a38f-5e71-4eee-8189-9361cfd19b84\") " pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:31:20 crc kubenswrapper[4956]: I0930 05:31:20.998887 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac16a38f-5e71-4eee-8189-9361cfd19b84-catalog-content\") pod \"redhat-operators-gv9cd\" (UID: \"ac16a38f-5e71-4eee-8189-9361cfd19b84\") " pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.100435 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d133d89e-16c0-45d9-9438-58398a70ac5d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d133d89e-16c0-45d9-9438-58398a70ac5d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.100478 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac16a38f-5e71-4eee-8189-9361cfd19b84-utilities\") pod \"redhat-operators-gv9cd\" (UID: \"ac16a38f-5e71-4eee-8189-9361cfd19b84\") " pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.100497 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr8k5\" (UniqueName: \"kubernetes.io/projected/ac16a38f-5e71-4eee-8189-9361cfd19b84-kube-api-access-cr8k5\") pod \"redhat-operators-gv9cd\" (UID: \"ac16a38f-5e71-4eee-8189-9361cfd19b84\") " pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.100523 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac16a38f-5e71-4eee-8189-9361cfd19b84-catalog-content\") pod \"redhat-operators-gv9cd\" (UID: \"ac16a38f-5e71-4eee-8189-9361cfd19b84\") " pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.100617 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d133d89e-16c0-45d9-9438-58398a70ac5d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d133d89e-16c0-45d9-9438-58398a70ac5d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.100943 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac16a38f-5e71-4eee-8189-9361cfd19b84-catalog-content\") pod \"redhat-operators-gv9cd\" (UID: \"ac16a38f-5e71-4eee-8189-9361cfd19b84\") " pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.101020 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac16a38f-5e71-4eee-8189-9361cfd19b84-utilities\") pod \"redhat-operators-gv9cd\" (UID: \"ac16a38f-5e71-4eee-8189-9361cfd19b84\") " pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.128310 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr8k5\" (UniqueName: \"kubernetes.io/projected/ac16a38f-5e71-4eee-8189-9361cfd19b84-kube-api-access-cr8k5\") pod \"redhat-operators-gv9cd\" (UID: \"ac16a38f-5e71-4eee-8189-9361cfd19b84\") " pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.173446 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.201720 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d133d89e-16c0-45d9-9438-58398a70ac5d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d133d89e-16c0-45d9-9438-58398a70ac5d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.201845 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d133d89e-16c0-45d9-9438-58398a70ac5d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d133d89e-16c0-45d9-9438-58398a70ac5d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.201937 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d133d89e-16c0-45d9-9438-58398a70ac5d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d133d89e-16c0-45d9-9438-58398a70ac5d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.220901 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d133d89e-16c0-45d9-9438-58398a70ac5d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d133d89e-16c0-45d9-9438-58398a70ac5d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.242460 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xpf5s"] Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.243463 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xpf5s" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.258970 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xpf5s"] Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.356196 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.404430 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec84e71d-6ee8-4833-b914-7a7c103ac14e-catalog-content\") pod \"redhat-operators-xpf5s\" (UID: \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\") " pod="openshift-marketplace/redhat-operators-xpf5s" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.404488 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec84e71d-6ee8-4833-b914-7a7c103ac14e-utilities\") pod \"redhat-operators-xpf5s\" (UID: \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\") " pod="openshift-marketplace/redhat-operators-xpf5s" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.404714 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8sm5\" (UniqueName: \"kubernetes.io/projected/ec84e71d-6ee8-4833-b914-7a7c103ac14e-kube-api-access-l8sm5\") pod \"redhat-operators-xpf5s\" (UID: \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\") " pod="openshift-marketplace/redhat-operators-xpf5s" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.469437 4956 generic.go:334] "Generic (PLEG): container finished" podID="dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" containerID="7211b5a08eaa6638ad92cff3abdee9d8c5354a2a33cb7dae524485ad35bc5498" exitCode=0 Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.469503 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmln2" event={"ID":"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e","Type":"ContainerDied","Data":"7211b5a08eaa6638ad92cff3abdee9d8c5354a2a33cb7dae524485ad35bc5498"} Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.469533 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmln2" event={"ID":"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e","Type":"ContainerStarted","Data":"5708c9fa8f2e13922ff39820b54f6c7972267fb20038f3fc459eaf1066ba7f47"} Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.474243 4956 generic.go:334] "Generic (PLEG): container finished" podID="fa47aa22-683e-413d-b004-054e1e331342" containerID="faa680893543a0018e9fdb4f82ce0cf89c665ef521de50852a7ea840c5fc2824" exitCode=0 Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.474306 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qhbk" event={"ID":"fa47aa22-683e-413d-b004-054e1e331342","Type":"ContainerDied","Data":"faa680893543a0018e9fdb4f82ce0cf89c665ef521de50852a7ea840c5fc2824"} Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.474329 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qhbk" event={"ID":"fa47aa22-683e-413d-b004-054e1e331342","Type":"ContainerStarted","Data":"1e9b16c78992cc7e15ea2a2b9bbb9bcd97f83d02b9d1653707275278d54182f0"} Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.479557 4956 generic.go:334] "Generic (PLEG): container finished" podID="2f6ab45a-e36e-4328-a0ff-1a62a322133e" containerID="a9c7733225e19d55b4f40addb661f3a00cfdcc0bdee35caa5b82f54e2c1f2e43" exitCode=0 Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.480157 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"2f6ab45a-e36e-4328-a0ff-1a62a322133e","Type":"ContainerDied","Data":"a9c7733225e19d55b4f40addb661f3a00cfdcc0bdee35caa5b82f54e2c1f2e43"} Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.509407 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8sm5\" (UniqueName: \"kubernetes.io/projected/ec84e71d-6ee8-4833-b914-7a7c103ac14e-kube-api-access-l8sm5\") pod \"redhat-operators-xpf5s\" (UID: \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\") " pod="openshift-marketplace/redhat-operators-xpf5s" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.509490 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec84e71d-6ee8-4833-b914-7a7c103ac14e-catalog-content\") pod \"redhat-operators-xpf5s\" (UID: \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\") " pod="openshift-marketplace/redhat-operators-xpf5s" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.509540 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec84e71d-6ee8-4833-b914-7a7c103ac14e-utilities\") pod \"redhat-operators-xpf5s\" (UID: \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\") " pod="openshift-marketplace/redhat-operators-xpf5s" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.510022 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec84e71d-6ee8-4833-b914-7a7c103ac14e-utilities\") pod \"redhat-operators-xpf5s\" (UID: \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\") " pod="openshift-marketplace/redhat-operators-xpf5s" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.511650 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec84e71d-6ee8-4833-b914-7a7c103ac14e-catalog-content\") pod \"redhat-operators-xpf5s\" (UID: 
\"ec84e71d-6ee8-4833-b914-7a7c103ac14e\") " pod="openshift-marketplace/redhat-operators-xpf5s" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.526407 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gv9cd"] Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.538268 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8sm5\" (UniqueName: \"kubernetes.io/projected/ec84e71d-6ee8-4833-b914-7a7c103ac14e-kube-api-access-l8sm5\") pod \"redhat-operators-xpf5s\" (UID: \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\") " pod="openshift-marketplace/redhat-operators-xpf5s" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.562960 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xpf5s" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.650837 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.675655 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.681663 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-mrbck" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.742655 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:31:21 crc kubenswrapper[4956]: W0930 05:31:21.745648 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd133d89e_16c0_45d9_9438_58398a70ac5d.slice/crio-880f560954f50d9f491ce5f21902568d396bfa1daf1e1b89432d76585076d90f WatchSource:0}: Error finding container 880f560954f50d9f491ce5f21902568d396bfa1daf1e1b89432d76585076d90f: Status 404 
returned error can't find the container with id 880f560954f50d9f491ce5f21902568d396bfa1daf1e1b89432d76585076d90f Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.761186 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.761218 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.775330 4956 patch_prober.go:28] interesting pod/console-f9d7485db-rtd48 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.775379 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rtd48" podUID="8f523f25-bc41-44dc-b311-bf6df1cbc2ee" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.861873 4956 patch_prober.go:28] interesting pod/router-default-5444994796-xg6dj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 05:31:21 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Sep 30 05:31:21 crc kubenswrapper[4956]: [+]process-running ok Sep 30 05:31:21 crc kubenswrapper[4956]: healthz check failed Sep 30 05:31:21 crc kubenswrapper[4956]: I0930 05:31:21.861936 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg6dj" podUID="7aa8b675-10f2-4d8e-8e3a-79359f16d7bc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 
05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.304373 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xpf5s"] Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.489944 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpf5s" event={"ID":"ec84e71d-6ee8-4833-b914-7a7c103ac14e","Type":"ContainerStarted","Data":"5d7478f34db55b811e8b3b323d0564c57d9189ba5532ec95e7cf58eeb02bd3ea"} Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.492827 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gv9cd" event={"ID":"ac16a38f-5e71-4eee-8189-9361cfd19b84","Type":"ContainerDied","Data":"eefa24cc4f78dfd3afe0b1008eb10d938b818a6b80e120f49f591e9062336345"} Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.492815 4956 generic.go:334] "Generic (PLEG): container finished" podID="ac16a38f-5e71-4eee-8189-9361cfd19b84" containerID="eefa24cc4f78dfd3afe0b1008eb10d938b818a6b80e120f49f591e9062336345" exitCode=0 Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.492967 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gv9cd" event={"ID":"ac16a38f-5e71-4eee-8189-9361cfd19b84","Type":"ContainerStarted","Data":"7f7d15b4e3afd61a14f41d1d153dfd1b877ed015172340872a41281dfca8cd33"} Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.496284 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d133d89e-16c0-45d9-9438-58398a70ac5d","Type":"ContainerStarted","Data":"e5cf19ac7003f25008a885c5b86fa0b8fc195376a4b2d305e8ab2e0403492e15"} Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.496326 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"d133d89e-16c0-45d9-9438-58398a70ac5d","Type":"ContainerStarted","Data":"880f560954f50d9f491ce5f21902568d396bfa1daf1e1b89432d76585076d90f"} Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.531782 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.5317647819999998 podStartE2EDuration="2.531764782s" podCreationTimestamp="2025-09-30 05:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:22.531275135 +0000 UTC m=+152.858395660" watchObservedRunningTime="2025-09-30 05:31:22.531764782 +0000 UTC m=+152.858885307" Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.742596 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.773442 4956 patch_prober.go:28] interesting pod/downloads-7954f5f757-v25lf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.773495 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-v25lf" podUID="91203589-4d13-4929-8bd1-28f8a40e2b44" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.773620 4956 patch_prober.go:28] interesting pod/downloads-7954f5f757-v25lf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Sep 30 05:31:22 crc 
kubenswrapper[4956]: I0930 05:31:22.773701 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v25lf" podUID="91203589-4d13-4929-8bd1-28f8a40e2b44" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.841944 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f6ab45a-e36e-4328-a0ff-1a62a322133e-kube-api-access\") pod \"2f6ab45a-e36e-4328-a0ff-1a62a322133e\" (UID: \"2f6ab45a-e36e-4328-a0ff-1a62a322133e\") " Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.842006 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f6ab45a-e36e-4328-a0ff-1a62a322133e-kubelet-dir\") pod \"2f6ab45a-e36e-4328-a0ff-1a62a322133e\" (UID: \"2f6ab45a-e36e-4328-a0ff-1a62a322133e\") " Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.842400 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f6ab45a-e36e-4328-a0ff-1a62a322133e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2f6ab45a-e36e-4328-a0ff-1a62a322133e" (UID: "2f6ab45a-e36e-4328-a0ff-1a62a322133e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.848959 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f6ab45a-e36e-4328-a0ff-1a62a322133e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2f6ab45a-e36e-4328-a0ff-1a62a322133e" (UID: "2f6ab45a-e36e-4328-a0ff-1a62a322133e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.855219 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.859453 4956 patch_prober.go:28] interesting pod/router-default-5444994796-xg6dj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 05:31:22 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Sep 30 05:31:22 crc kubenswrapper[4956]: [+]process-running ok Sep 30 05:31:22 crc kubenswrapper[4956]: healthz check failed Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.859530 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg6dj" podUID="7aa8b675-10f2-4d8e-8e3a-79359f16d7bc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.944611 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2f6ab45a-e36e-4328-a0ff-1a62a322133e-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 05:31:22 crc kubenswrapper[4956]: I0930 05:31:22.944644 4956 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2f6ab45a-e36e-4328-a0ff-1a62a322133e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 05:31:23 crc kubenswrapper[4956]: I0930 05:31:23.531574 4956 generic.go:334] "Generic (PLEG): container finished" podID="ec84e71d-6ee8-4833-b914-7a7c103ac14e" containerID="5925d04f5b9328582ced2d915151c082fb88d34aa0145f6e549ac148f4941522" exitCode=0 Sep 30 05:31:23 crc kubenswrapper[4956]: I0930 05:31:23.531651 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-xpf5s" event={"ID":"ec84e71d-6ee8-4833-b914-7a7c103ac14e","Type":"ContainerDied","Data":"5925d04f5b9328582ced2d915151c082fb88d34aa0145f6e549ac148f4941522"} Sep 30 05:31:23 crc kubenswrapper[4956]: I0930 05:31:23.542864 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2f6ab45a-e36e-4328-a0ff-1a62a322133e","Type":"ContainerDied","Data":"62923367a4fc01101376cda4bcb957a8c22b4ae628e7e902d2108ea918d6764f"} Sep 30 05:31:23 crc kubenswrapper[4956]: I0930 05:31:23.542874 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 05:31:23 crc kubenswrapper[4956]: I0930 05:31:23.543136 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62923367a4fc01101376cda4bcb957a8c22b4ae628e7e902d2108ea918d6764f" Sep 30 05:31:23 crc kubenswrapper[4956]: I0930 05:31:23.559767 4956 generic.go:334] "Generic (PLEG): container finished" podID="d133d89e-16c0-45d9-9438-58398a70ac5d" containerID="e5cf19ac7003f25008a885c5b86fa0b8fc195376a4b2d305e8ab2e0403492e15" exitCode=0 Sep 30 05:31:23 crc kubenswrapper[4956]: I0930 05:31:23.559801 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d133d89e-16c0-45d9-9438-58398a70ac5d","Type":"ContainerDied","Data":"e5cf19ac7003f25008a885c5b86fa0b8fc195376a4b2d305e8ab2e0403492e15"} Sep 30 05:31:23 crc kubenswrapper[4956]: I0930 05:31:23.857653 4956 patch_prober.go:28] interesting pod/router-default-5444994796-xg6dj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 05:31:23 crc kubenswrapper[4956]: [-]has-synced failed: reason withheld Sep 30 05:31:23 crc kubenswrapper[4956]: [+]process-running ok Sep 30 
05:31:23 crc kubenswrapper[4956]: healthz check failed Sep 30 05:31:23 crc kubenswrapper[4956]: I0930 05:31:23.857707 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg6dj" podUID="7aa8b675-10f2-4d8e-8e3a-79359f16d7bc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 05:31:24 crc kubenswrapper[4956]: I0930 05:31:24.758383 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w22x5" Sep 30 05:31:24 crc kubenswrapper[4956]: I0930 05:31:24.848005 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 05:31:24 crc kubenswrapper[4956]: I0930 05:31:24.859308 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:24 crc kubenswrapper[4956]: I0930 05:31:24.869229 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xg6dj" Sep 30 05:31:25 crc kubenswrapper[4956]: I0930 05:31:25.007744 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d133d89e-16c0-45d9-9438-58398a70ac5d-kubelet-dir\") pod \"d133d89e-16c0-45d9-9438-58398a70ac5d\" (UID: \"d133d89e-16c0-45d9-9438-58398a70ac5d\") " Sep 30 05:31:25 crc kubenswrapper[4956]: I0930 05:31:25.007863 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d133d89e-16c0-45d9-9438-58398a70ac5d-kube-api-access\") pod \"d133d89e-16c0-45d9-9438-58398a70ac5d\" (UID: \"d133d89e-16c0-45d9-9438-58398a70ac5d\") " Sep 30 05:31:25 crc kubenswrapper[4956]: I0930 05:31:25.008088 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/d133d89e-16c0-45d9-9438-58398a70ac5d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d133d89e-16c0-45d9-9438-58398a70ac5d" (UID: "d133d89e-16c0-45d9-9438-58398a70ac5d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:31:25 crc kubenswrapper[4956]: I0930 05:31:25.008904 4956 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d133d89e-16c0-45d9-9438-58398a70ac5d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 05:31:25 crc kubenswrapper[4956]: I0930 05:31:25.031232 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d133d89e-16c0-45d9-9438-58398a70ac5d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d133d89e-16c0-45d9-9438-58398a70ac5d" (UID: "d133d89e-16c0-45d9-9438-58398a70ac5d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:31:25 crc kubenswrapper[4956]: I0930 05:31:25.111068 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d133d89e-16c0-45d9-9438-58398a70ac5d-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 05:31:25 crc kubenswrapper[4956]: I0930 05:31:25.599601 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d133d89e-16c0-45d9-9438-58398a70ac5d","Type":"ContainerDied","Data":"880f560954f50d9f491ce5f21902568d396bfa1daf1e1b89432d76585076d90f"} Sep 30 05:31:25 crc kubenswrapper[4956]: I0930 05:31:25.599656 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="880f560954f50d9f491ce5f21902568d396bfa1daf1e1b89432d76585076d90f" Sep 30 05:31:25 crc kubenswrapper[4956]: I0930 05:31:25.599764 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 05:31:31 crc kubenswrapper[4956]: I0930 05:31:31.789249 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:31 crc kubenswrapper[4956]: I0930 05:31:31.795642 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:31:32 crc kubenswrapper[4956]: I0930 05:31:32.221672 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:31:32 crc kubenswrapper[4956]: I0930 05:31:32.230429 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/184140db-c30d-4f88-89ff-b7aa2dcca3d1-metrics-certs\") pod \"network-metrics-daemon-ctwgh\" (UID: \"184140db-c30d-4f88-89ff-b7aa2dcca3d1\") " pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:31:32 crc kubenswrapper[4956]: I0930 05:31:32.400562 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ctwgh" Sep 30 05:31:32 crc kubenswrapper[4956]: I0930 05:31:32.778867 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-v25lf" Sep 30 05:31:39 crc kubenswrapper[4956]: I0930 05:31:39.076519 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.009614 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ctwgh"] Sep 30 05:31:43 crc kubenswrapper[4956]: W0930 05:31:43.017700 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod184140db_c30d_4f88_89ff_b7aa2dcca3d1.slice/crio-26efa2a504a70e54c5edb720f7cceb6ee2f9ac0d3f658d592b19c8e03ffa100c WatchSource:0}: Error finding container 26efa2a504a70e54c5edb720f7cceb6ee2f9ac0d3f658d592b19c8e03ffa100c: Status 404 returned error can't find the container with id 26efa2a504a70e54c5edb720f7cceb6ee2f9ac0d3f658d592b19c8e03ffa100c Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.725654 4956 generic.go:334] "Generic (PLEG): container finished" podID="ac16a38f-5e71-4eee-8189-9361cfd19b84" containerID="8d64d3bc992a15bd7faa98a8bf7eb41d6d49adea26a8dd835a64aa3a031a8c4b" exitCode=0 Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.725808 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gv9cd" event={"ID":"ac16a38f-5e71-4eee-8189-9361cfd19b84","Type":"ContainerDied","Data":"8d64d3bc992a15bd7faa98a8bf7eb41d6d49adea26a8dd835a64aa3a031a8c4b"} Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.734461 4956 generic.go:334] "Generic (PLEG): container finished" podID="dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" containerID="d1954c53ce0fad01a5d32f4ec39e949816f535551e8458c8e88f624318ad0696" 
exitCode=0 Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.734635 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmln2" event={"ID":"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e","Type":"ContainerDied","Data":"d1954c53ce0fad01a5d32f4ec39e949816f535551e8458c8e88f624318ad0696"} Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.743736 4956 generic.go:334] "Generic (PLEG): container finished" podID="7b351e8a-d6e9-426c-a3fc-b486964f28c7" containerID="6dd9cc97d1df93aae83a24e14daa92c05bd6404bd2bf17514e2b61f1addefb45" exitCode=0 Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.743826 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk8m8" event={"ID":"7b351e8a-d6e9-426c-a3fc-b486964f28c7","Type":"ContainerDied","Data":"6dd9cc97d1df93aae83a24e14daa92c05bd6404bd2bf17514e2b61f1addefb45"} Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.747596 4956 generic.go:334] "Generic (PLEG): container finished" podID="57114151-313a-4d50-bf79-8d92e5c50208" containerID="d1bc82e50ee3190325a275e065ba8480aef6da91ed6dfed9ce57cd406e1bec20" exitCode=0 Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.747717 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvx6l" event={"ID":"57114151-313a-4d50-bf79-8d92e5c50208","Type":"ContainerDied","Data":"d1bc82e50ee3190325a275e065ba8480aef6da91ed6dfed9ce57cd406e1bec20"} Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.750070 4956 generic.go:334] "Generic (PLEG): container finished" podID="fa47aa22-683e-413d-b004-054e1e331342" containerID="40622a86a4db5c392ba91305a8636cbbd374655f65222aa536ff531790bff613" exitCode=0 Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.750164 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qhbk" 
event={"ID":"fa47aa22-683e-413d-b004-054e1e331342","Type":"ContainerDied","Data":"40622a86a4db5c392ba91305a8636cbbd374655f65222aa536ff531790bff613"} Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.752151 4956 generic.go:334] "Generic (PLEG): container finished" podID="ec84e71d-6ee8-4833-b914-7a7c103ac14e" containerID="1d27fb5233014e2a072e2191760f7f54ea5396e9e6df2a6fed91994840cfb1d9" exitCode=0 Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.752255 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpf5s" event={"ID":"ec84e71d-6ee8-4833-b914-7a7c103ac14e","Type":"ContainerDied","Data":"1d27fb5233014e2a072e2191760f7f54ea5396e9e6df2a6fed91994840cfb1d9"} Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.754338 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" event={"ID":"184140db-c30d-4f88-89ff-b7aa2dcca3d1","Type":"ContainerStarted","Data":"310361c055ad47790fda04cc7d75984368ace18d9936bfc787bc48fff706ca30"} Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.754390 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" event={"ID":"184140db-c30d-4f88-89ff-b7aa2dcca3d1","Type":"ContainerStarted","Data":"26efa2a504a70e54c5edb720f7cceb6ee2f9ac0d3f658d592b19c8e03ffa100c"} Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.756088 4956 generic.go:334] "Generic (PLEG): container finished" podID="cb546d14-6d5d-4b44-b501-070ea2251e4b" containerID="a8e51280b0329a037aafa3822cb5584aa0b98182a6889a7ad46c91b97e841ecf" exitCode=0 Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.756179 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmr6c" event={"ID":"cb546d14-6d5d-4b44-b501-070ea2251e4b","Type":"ContainerDied","Data":"a8e51280b0329a037aafa3822cb5584aa0b98182a6889a7ad46c91b97e841ecf"} Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 
05:31:43.758357 4956 generic.go:334] "Generic (PLEG): container finished" podID="4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" containerID="3df2aeb034252bcf737d67d89dc2411610633551ea872358a1063cc74e6aa333" exitCode=0 Sep 30 05:31:43 crc kubenswrapper[4956]: I0930 05:31:43.758411 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nrh7" event={"ID":"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0","Type":"ContainerDied","Data":"3df2aeb034252bcf737d67d89dc2411610633551ea872358a1063cc74e6aa333"} Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.765624 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpf5s" event={"ID":"ec84e71d-6ee8-4833-b914-7a7c103ac14e","Type":"ContainerStarted","Data":"59f400420f0850f2f145796ae7dd84b85147a5549b800c1f05d8d1a95f29cfb8"} Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.767741 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gv9cd" event={"ID":"ac16a38f-5e71-4eee-8189-9361cfd19b84","Type":"ContainerStarted","Data":"795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101"} Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.770482 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nrh7" event={"ID":"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0","Type":"ContainerStarted","Data":"62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959"} Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.772379 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ctwgh" event={"ID":"184140db-c30d-4f88-89ff-b7aa2dcca3d1","Type":"ContainerStarted","Data":"5341f0aeeebbf0b0efc28634487524681e8b07bed8eb4468b0215cff68224cf0"} Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.775054 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk8m8" 
event={"ID":"7b351e8a-d6e9-426c-a3fc-b486964f28c7","Type":"ContainerStarted","Data":"d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a"} Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.777074 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmr6c" event={"ID":"cb546d14-6d5d-4b44-b501-070ea2251e4b","Type":"ContainerStarted","Data":"49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138"} Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.778918 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvx6l" event={"ID":"57114151-313a-4d50-bf79-8d92e5c50208","Type":"ContainerStarted","Data":"6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32"} Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.781510 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmln2" event={"ID":"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e","Type":"ContainerStarted","Data":"8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839"} Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.783156 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xpf5s" podStartSLOduration=2.932983523 podStartE2EDuration="23.783146341s" podCreationTimestamp="2025-09-30 05:31:21 +0000 UTC" firstStartedPulling="2025-09-30 05:31:23.538816155 +0000 UTC m=+153.865936680" lastFinishedPulling="2025-09-30 05:31:44.388978973 +0000 UTC m=+174.716099498" observedRunningTime="2025-09-30 05:31:44.77992377 +0000 UTC m=+175.107044285" watchObservedRunningTime="2025-09-30 05:31:44.783146341 +0000 UTC m=+175.110266866" Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.785489 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qhbk" 
event={"ID":"fa47aa22-683e-413d-b004-054e1e331342","Type":"ContainerStarted","Data":"6b6ad14a8988766d0931767cf4700e7f4614f74a204d152b438f46d71fa7a386"} Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.803179 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mmr6c" podStartSLOduration=3.6001888539999998 podStartE2EDuration="27.803162344s" podCreationTimestamp="2025-09-30 05:31:17 +0000 UTC" firstStartedPulling="2025-09-30 05:31:20.367788549 +0000 UTC m=+150.694909074" lastFinishedPulling="2025-09-30 05:31:44.570762019 +0000 UTC m=+174.897882564" observedRunningTime="2025-09-30 05:31:44.801444805 +0000 UTC m=+175.128565350" watchObservedRunningTime="2025-09-30 05:31:44.803162344 +0000 UTC m=+175.130282859" Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.819043 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gv9cd" podStartSLOduration=2.875620082 podStartE2EDuration="24.819025716s" podCreationTimestamp="2025-09-30 05:31:20 +0000 UTC" firstStartedPulling="2025-09-30 05:31:22.494189789 +0000 UTC m=+152.821310314" lastFinishedPulling="2025-09-30 05:31:44.437595413 +0000 UTC m=+174.764715948" observedRunningTime="2025-09-30 05:31:44.817873716 +0000 UTC m=+175.144994251" watchObservedRunningTime="2025-09-30 05:31:44.819025716 +0000 UTC m=+175.146146241" Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.837494 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bvx6l" podStartSLOduration=2.669292722 podStartE2EDuration="26.837480265s" podCreationTimestamp="2025-09-30 05:31:18 +0000 UTC" firstStartedPulling="2025-09-30 05:31:20.377988517 +0000 UTC m=+150.705109042" lastFinishedPulling="2025-09-30 05:31:44.54617604 +0000 UTC m=+174.873296585" observedRunningTime="2025-09-30 05:31:44.836304776 +0000 UTC m=+175.163425301" 
watchObservedRunningTime="2025-09-30 05:31:44.837480265 +0000 UTC m=+175.164600790" Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.855589 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4nrh7" podStartSLOduration=3.90406739 podStartE2EDuration="27.855575154s" podCreationTimestamp="2025-09-30 05:31:17 +0000 UTC" firstStartedPulling="2025-09-30 05:31:20.411681728 +0000 UTC m=+150.738802253" lastFinishedPulling="2025-09-30 05:31:44.363189472 +0000 UTC m=+174.690310017" observedRunningTime="2025-09-30 05:31:44.854430424 +0000 UTC m=+175.181550959" watchObservedRunningTime="2025-09-30 05:31:44.855575154 +0000 UTC m=+175.182695679" Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.873413 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ctwgh" podStartSLOduration=154.873396362 podStartE2EDuration="2m34.873396362s" podCreationTimestamp="2025-09-30 05:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:31:44.871815278 +0000 UTC m=+175.198935803" watchObservedRunningTime="2025-09-30 05:31:44.873396362 +0000 UTC m=+175.200516887" Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.892937 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rk8m8" podStartSLOduration=2.876796817 podStartE2EDuration="26.892921699s" podCreationTimestamp="2025-09-30 05:31:18 +0000 UTC" firstStartedPulling="2025-09-30 05:31:20.456549619 +0000 UTC m=+150.783670144" lastFinishedPulling="2025-09-30 05:31:44.472674501 +0000 UTC m=+174.799795026" observedRunningTime="2025-09-30 05:31:44.890502006 +0000 UTC m=+175.217622551" watchObservedRunningTime="2025-09-30 05:31:44.892921699 +0000 UTC m=+175.220042214" Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.918987 4956 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lmln2" podStartSLOduration=3.083119084 podStartE2EDuration="25.918971708s" podCreationTimestamp="2025-09-30 05:31:19 +0000 UTC" firstStartedPulling="2025-09-30 05:31:21.472595599 +0000 UTC m=+151.799716124" lastFinishedPulling="2025-09-30 05:31:44.308448213 +0000 UTC m=+174.635568748" observedRunningTime="2025-09-30 05:31:44.913928406 +0000 UTC m=+175.241048931" watchObservedRunningTime="2025-09-30 05:31:44.918971708 +0000 UTC m=+175.246092233" Sep 30 05:31:44 crc kubenswrapper[4956]: I0930 05:31:44.935188 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4qhbk" podStartSLOduration=2.117355823 podStartE2EDuration="24.935169781s" podCreationTimestamp="2025-09-30 05:31:20 +0000 UTC" firstStartedPulling="2025-09-30 05:31:21.47759417 +0000 UTC m=+151.804714695" lastFinishedPulling="2025-09-30 05:31:44.295408128 +0000 UTC m=+174.622528653" observedRunningTime="2025-09-30 05:31:44.934212238 +0000 UTC m=+175.261332763" watchObservedRunningTime="2025-09-30 05:31:44.935169781 +0000 UTC m=+175.262290296" Sep 30 05:31:47 crc kubenswrapper[4956]: I0930 05:31:47.988135 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:47 crc kubenswrapper[4956]: I0930 05:31:47.988607 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:48 crc kubenswrapper[4956]: I0930 05:31:48.073244 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:31:48 crc kubenswrapper[4956]: I0930 05:31:48.073290 4956 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:31:48 crc kubenswrapper[4956]: I0930 05:31:48.184779 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:48 crc kubenswrapper[4956]: I0930 05:31:48.184861 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:48 crc kubenswrapper[4956]: I0930 05:31:48.235241 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:31:48 crc kubenswrapper[4956]: I0930 05:31:48.236274 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:31:48 crc kubenswrapper[4956]: I0930 05:31:48.459918 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:48 crc kubenswrapper[4956]: I0930 05:31:48.459977 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:48 crc kubenswrapper[4956]: I0930 05:31:48.503714 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rk8m8" Sep 30 05:31:48 crc kubenswrapper[4956]: I0930 05:31:48.638839 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:48 crc kubenswrapper[4956]: I0930 05:31:48.638888 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:48 crc kubenswrapper[4956]: I0930 05:31:48.681019 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bvx6l" Sep 30 05:31:50 crc kubenswrapper[4956]: I0930 05:31:50.160241 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:50 crc kubenswrapper[4956]: I0930 05:31:50.160588 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:50 crc kubenswrapper[4956]: I0930 05:31:50.218500 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:50 crc kubenswrapper[4956]: I0930 05:31:50.602029 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:50 crc kubenswrapper[4956]: I0930 05:31:50.602086 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:50 crc kubenswrapper[4956]: I0930 05:31:50.643575 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:50 crc kubenswrapper[4956]: I0930 05:31:50.870398 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:31:50 crc kubenswrapper[4956]: I0930 05:31:50.895721 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4qhbk" Sep 30 05:31:51 crc kubenswrapper[4956]: I0930 05:31:51.173987 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:31:51 crc kubenswrapper[4956]: I0930 
05:31:51.174077 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gv9cd"
Sep 30 05:31:51 crc kubenswrapper[4956]: I0930 05:31:51.219671 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gv9cd"
Sep 30 05:31:51 crc kubenswrapper[4956]: I0930 05:31:51.564093 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xpf5s"
Sep 30 05:31:51 crc kubenswrapper[4956]: I0930 05:31:51.564212 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xpf5s"
Sep 30 05:31:51 crc kubenswrapper[4956]: I0930 05:31:51.624572 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xpf5s"
Sep 30 05:31:51 crc kubenswrapper[4956]: I0930 05:31:51.870566 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xpf5s"
Sep 30 05:31:51 crc kubenswrapper[4956]: I0930 05:31:51.872161 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gv9cd"
Sep 30 05:31:52 crc kubenswrapper[4956]: I0930 05:31:52.299096 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j9dnt"
Sep 30 05:31:53 crc kubenswrapper[4956]: I0930 05:31:53.987840 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qhbk"]
Sep 30 05:31:53 crc kubenswrapper[4956]: I0930 05:31:53.989809 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4qhbk" podUID="fa47aa22-683e-413d-b004-054e1e331342" containerName="registry-server" containerID="cri-o://6b6ad14a8988766d0931767cf4700e7f4614f74a204d152b438f46d71fa7a386" gracePeriod=2
Sep 30 05:31:54 crc kubenswrapper[4956]: I0930 05:31:54.190349 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xpf5s"]
Sep 30 05:31:54 crc kubenswrapper[4956]: I0930 05:31:54.190666 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xpf5s" podUID="ec84e71d-6ee8-4833-b914-7a7c103ac14e" containerName="registry-server" containerID="cri-o://59f400420f0850f2f145796ae7dd84b85147a5549b800c1f05d8d1a95f29cfb8" gracePeriod=2
Sep 30 05:31:54 crc kubenswrapper[4956]: I0930 05:31:54.868393 4956 generic.go:334] "Generic (PLEG): container finished" podID="fa47aa22-683e-413d-b004-054e1e331342" containerID="6b6ad14a8988766d0931767cf4700e7f4614f74a204d152b438f46d71fa7a386" exitCode=0
Sep 30 05:31:54 crc kubenswrapper[4956]: I0930 05:31:54.868571 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qhbk" event={"ID":"fa47aa22-683e-413d-b004-054e1e331342","Type":"ContainerDied","Data":"6b6ad14a8988766d0931767cf4700e7f4614f74a204d152b438f46d71fa7a386"}
Sep 30 05:31:54 crc kubenswrapper[4956]: I0930 05:31:54.873391 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpf5s" event={"ID":"ec84e71d-6ee8-4833-b914-7a7c103ac14e","Type":"ContainerDied","Data":"59f400420f0850f2f145796ae7dd84b85147a5549b800c1f05d8d1a95f29cfb8"}
Sep 30 05:31:54 crc kubenswrapper[4956]: I0930 05:31:54.873326 4956 generic.go:334] "Generic (PLEG): container finished" podID="ec84e71d-6ee8-4833-b914-7a7c103ac14e" containerID="59f400420f0850f2f145796ae7dd84b85147a5549b800c1f05d8d1a95f29cfb8" exitCode=0
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.293528 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qhbk"
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.379996 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xpf5s"
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.480750 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa47aa22-683e-413d-b004-054e1e331342-utilities\") pod \"fa47aa22-683e-413d-b004-054e1e331342\" (UID: \"fa47aa22-683e-413d-b004-054e1e331342\") "
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.480863 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mw2f\" (UniqueName: \"kubernetes.io/projected/fa47aa22-683e-413d-b004-054e1e331342-kube-api-access-4mw2f\") pod \"fa47aa22-683e-413d-b004-054e1e331342\" (UID: \"fa47aa22-683e-413d-b004-054e1e331342\") "
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.480981 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa47aa22-683e-413d-b004-054e1e331342-catalog-content\") pod \"fa47aa22-683e-413d-b004-054e1e331342\" (UID: \"fa47aa22-683e-413d-b004-054e1e331342\") "
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.481016 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec84e71d-6ee8-4833-b914-7a7c103ac14e-utilities\") pod \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\" (UID: \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\") "
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.481048 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8sm5\" (UniqueName: \"kubernetes.io/projected/ec84e71d-6ee8-4833-b914-7a7c103ac14e-kube-api-access-l8sm5\") pod \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\" (UID: \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\") "
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.481950 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec84e71d-6ee8-4833-b914-7a7c103ac14e-utilities" (OuterVolumeSpecName: "utilities") pod "ec84e71d-6ee8-4833-b914-7a7c103ac14e" (UID: "ec84e71d-6ee8-4833-b914-7a7c103ac14e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.482032 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa47aa22-683e-413d-b004-054e1e331342-utilities" (OuterVolumeSpecName: "utilities") pod "fa47aa22-683e-413d-b004-054e1e331342" (UID: "fa47aa22-683e-413d-b004-054e1e331342"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.482297 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec84e71d-6ee8-4833-b914-7a7c103ac14e-catalog-content\") pod \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\" (UID: \"ec84e71d-6ee8-4833-b914-7a7c103ac14e\") "
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.482537 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec84e71d-6ee8-4833-b914-7a7c103ac14e-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.482563 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa47aa22-683e-413d-b004-054e1e331342-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.488084 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec84e71d-6ee8-4833-b914-7a7c103ac14e-kube-api-access-l8sm5" (OuterVolumeSpecName: "kube-api-access-l8sm5") pod "ec84e71d-6ee8-4833-b914-7a7c103ac14e" (UID: "ec84e71d-6ee8-4833-b914-7a7c103ac14e"). InnerVolumeSpecName "kube-api-access-l8sm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.488802 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa47aa22-683e-413d-b004-054e1e331342-kube-api-access-4mw2f" (OuterVolumeSpecName: "kube-api-access-4mw2f") pod "fa47aa22-683e-413d-b004-054e1e331342" (UID: "fa47aa22-683e-413d-b004-054e1e331342"). InnerVolumeSpecName "kube-api-access-4mw2f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.495887 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa47aa22-683e-413d-b004-054e1e331342-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa47aa22-683e-413d-b004-054e1e331342" (UID: "fa47aa22-683e-413d-b004-054e1e331342"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.566622 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec84e71d-6ee8-4833-b914-7a7c103ac14e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec84e71d-6ee8-4833-b914-7a7c103ac14e" (UID: "ec84e71d-6ee8-4833-b914-7a7c103ac14e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.583599 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa47aa22-683e-413d-b004-054e1e331342-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.583654 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8sm5\" (UniqueName: \"kubernetes.io/projected/ec84e71d-6ee8-4833-b914-7a7c103ac14e-kube-api-access-l8sm5\") on node \"crc\" DevicePath \"\""
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.583667 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec84e71d-6ee8-4833-b914-7a7c103ac14e-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.583678 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mw2f\" (UniqueName: \"kubernetes.io/projected/fa47aa22-683e-413d-b004-054e1e331342-kube-api-access-4mw2f\") on node \"crc\" DevicePath \"\""
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.883403 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qhbk" event={"ID":"fa47aa22-683e-413d-b004-054e1e331342","Type":"ContainerDied","Data":"1e9b16c78992cc7e15ea2a2b9bbb9bcd97f83d02b9d1653707275278d54182f0"}
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.883479 4956 scope.go:117] "RemoveContainer" containerID="6b6ad14a8988766d0931767cf4700e7f4614f74a204d152b438f46d71fa7a386"
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.883656 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qhbk"
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.891385 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpf5s" event={"ID":"ec84e71d-6ee8-4833-b914-7a7c103ac14e","Type":"ContainerDied","Data":"5d7478f34db55b811e8b3b323d0564c57d9189ba5532ec95e7cf58eeb02bd3ea"}
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.891482 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xpf5s"
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.915401 4956 scope.go:117] "RemoveContainer" containerID="40622a86a4db5c392ba91305a8636cbbd374655f65222aa536ff531790bff613"
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.936039 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xpf5s"]
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.943240 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xpf5s"]
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.955912 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qhbk"]
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.961968 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qhbk"]
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.971685 4956 scope.go:117] "RemoveContainer" containerID="faa680893543a0018e9fdb4f82ce0cf89c665ef521de50852a7ea840c5fc2824"
Sep 30 05:31:55 crc kubenswrapper[4956]: I0930 05:31:55.999277 4956 scope.go:117] "RemoveContainer" containerID="59f400420f0850f2f145796ae7dd84b85147a5549b800c1f05d8d1a95f29cfb8"
Sep 30 05:31:56 crc kubenswrapper[4956]: I0930 05:31:56.020172 4956 scope.go:117] "RemoveContainer" containerID="1d27fb5233014e2a072e2191760f7f54ea5396e9e6df2a6fed91994840cfb1d9"
Sep 30 05:31:56 crc kubenswrapper[4956]: I0930 05:31:56.041958 4956 scope.go:117] "RemoveContainer" containerID="5925d04f5b9328582ced2d915151c082fb88d34aa0145f6e549ac148f4941522"
Sep 30 05:31:56 crc kubenswrapper[4956]: I0930 05:31:56.350999 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec84e71d-6ee8-4833-b914-7a7c103ac14e" path="/var/lib/kubelet/pods/ec84e71d-6ee8-4833-b914-7a7c103ac14e/volumes"
Sep 30 05:31:56 crc kubenswrapper[4956]: I0930 05:31:56.352633 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa47aa22-683e-413d-b004-054e1e331342" path="/var/lib/kubelet/pods/fa47aa22-683e-413d-b004-054e1e331342/volumes"
Sep 30 05:31:57 crc kubenswrapper[4956]: I0930 05:31:57.688210 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 05:31:58 crc kubenswrapper[4956]: I0930 05:31:58.028755 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mmr6c"
Sep 30 05:31:58 crc kubenswrapper[4956]: I0930 05:31:58.233626 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4nrh7"
Sep 30 05:31:58 crc kubenswrapper[4956]: I0930 05:31:58.508076 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rk8m8"
Sep 30 05:31:58 crc kubenswrapper[4956]: I0930 05:31:58.672047 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bvx6l"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.405752 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rk8m8"]
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.406464 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rk8m8" podUID="7b351e8a-d6e9-426c-a3fc-b486964f28c7" containerName="registry-server" containerID="cri-o://d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a" gracePeriod=2
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.587143 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bvx6l"]
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.587346 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bvx6l" podUID="57114151-313a-4d50-bf79-8d92e5c50208" containerName="registry-server" containerID="cri-o://6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32" gracePeriod=2
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.797922 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rk8m8"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.851805 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kkpp\" (UniqueName: \"kubernetes.io/projected/7b351e8a-d6e9-426c-a3fc-b486964f28c7-kube-api-access-7kkpp\") pod \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\" (UID: \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\") "
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.851864 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b351e8a-d6e9-426c-a3fc-b486964f28c7-catalog-content\") pod \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\" (UID: \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\") "
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.851920 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b351e8a-d6e9-426c-a3fc-b486964f28c7-utilities\") pod \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\" (UID: \"7b351e8a-d6e9-426c-a3fc-b486964f28c7\") "
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.852670 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b351e8a-d6e9-426c-a3fc-b486964f28c7-utilities" (OuterVolumeSpecName: "utilities") pod "7b351e8a-d6e9-426c-a3fc-b486964f28c7" (UID: "7b351e8a-d6e9-426c-a3fc-b486964f28c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.862301 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b351e8a-d6e9-426c-a3fc-b486964f28c7-kube-api-access-7kkpp" (OuterVolumeSpecName: "kube-api-access-7kkpp") pod "7b351e8a-d6e9-426c-a3fc-b486964f28c7" (UID: "7b351e8a-d6e9-426c-a3fc-b486964f28c7"). InnerVolumeSpecName "kube-api-access-7kkpp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.878709 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bvx6l"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.899194 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b351e8a-d6e9-426c-a3fc-b486964f28c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b351e8a-d6e9-426c-a3fc-b486964f28c7" (UID: "7b351e8a-d6e9-426c-a3fc-b486964f28c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.920959 4956 generic.go:334] "Generic (PLEG): container finished" podID="7b351e8a-d6e9-426c-a3fc-b486964f28c7" containerID="d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a" exitCode=0
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.921015 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk8m8" event={"ID":"7b351e8a-d6e9-426c-a3fc-b486964f28c7","Type":"ContainerDied","Data":"d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a"}
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.921041 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rk8m8" event={"ID":"7b351e8a-d6e9-426c-a3fc-b486964f28c7","Type":"ContainerDied","Data":"30ae1bd6981b764824ec8096a1589e444907bd9ab4e4f9821042c48a6d3207d9"}
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.921036 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rk8m8"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.921140 4956 scope.go:117] "RemoveContainer" containerID="d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.932600 4956 generic.go:334] "Generic (PLEG): container finished" podID="57114151-313a-4d50-bf79-8d92e5c50208" containerID="6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32" exitCode=0
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.932649 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvx6l" event={"ID":"57114151-313a-4d50-bf79-8d92e5c50208","Type":"ContainerDied","Data":"6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32"}
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.932679 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bvx6l" event={"ID":"57114151-313a-4d50-bf79-8d92e5c50208","Type":"ContainerDied","Data":"062b6795e822e480e62b62052b444fb5ab6a48f92e429ccc85c61b6d72b2aa55"}
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.932706 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bvx6l"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.938341 4956 scope.go:117] "RemoveContainer" containerID="6dd9cc97d1df93aae83a24e14daa92c05bd6404bd2bf17514e2b61f1addefb45"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.953977 4956 scope.go:117] "RemoveContainer" containerID="7bf08645b6b589c613417ea8e174b48a5635bc3d927db226a2aa877dc298e75d"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.954366 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57114151-313a-4d50-bf79-8d92e5c50208-utilities\") pod \"57114151-313a-4d50-bf79-8d92e5c50208\" (UID: \"57114151-313a-4d50-bf79-8d92e5c50208\") "
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.954398 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57114151-313a-4d50-bf79-8d92e5c50208-catalog-content\") pod \"57114151-313a-4d50-bf79-8d92e5c50208\" (UID: \"57114151-313a-4d50-bf79-8d92e5c50208\") "
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.954436 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gs8h\" (UniqueName: \"kubernetes.io/projected/57114151-313a-4d50-bf79-8d92e5c50208-kube-api-access-2gs8h\") pod \"57114151-313a-4d50-bf79-8d92e5c50208\" (UID: \"57114151-313a-4d50-bf79-8d92e5c50208\") "
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.954654 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kkpp\" (UniqueName: \"kubernetes.io/projected/7b351e8a-d6e9-426c-a3fc-b486964f28c7-kube-api-access-7kkpp\") on node \"crc\" DevicePath \"\""
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.954669 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b351e8a-d6e9-426c-a3fc-b486964f28c7-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.954680 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b351e8a-d6e9-426c-a3fc-b486964f28c7-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.955692 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57114151-313a-4d50-bf79-8d92e5c50208-utilities" (OuterVolumeSpecName: "utilities") pod "57114151-313a-4d50-bf79-8d92e5c50208" (UID: "57114151-313a-4d50-bf79-8d92e5c50208"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.956261 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rk8m8"]
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.958165 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57114151-313a-4d50-bf79-8d92e5c50208-kube-api-access-2gs8h" (OuterVolumeSpecName: "kube-api-access-2gs8h") pod "57114151-313a-4d50-bf79-8d92e5c50208" (UID: "57114151-313a-4d50-bf79-8d92e5c50208"). InnerVolumeSpecName "kube-api-access-2gs8h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.959028 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rk8m8"]
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.966602 4956 scope.go:117] "RemoveContainer" containerID="d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a"
Sep 30 05:32:00 crc kubenswrapper[4956]: E0930 05:32:00.967028 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a\": container with ID starting with d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a not found: ID does not exist" containerID="d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.967060 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a"} err="failed to get container status \"d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a\": rpc error: code = NotFound desc = could not find container \"d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a\": container with ID starting with d0b8f6e00a45a152979f7cc7f78205797fa8f3076212a05c809a63a4e541cd4a not found: ID does not exist"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.967099 4956 scope.go:117] "RemoveContainer" containerID="6dd9cc97d1df93aae83a24e14daa92c05bd6404bd2bf17514e2b61f1addefb45"
Sep 30 05:32:00 crc kubenswrapper[4956]: E0930 05:32:00.967573 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dd9cc97d1df93aae83a24e14daa92c05bd6404bd2bf17514e2b61f1addefb45\": container with ID starting with 6dd9cc97d1df93aae83a24e14daa92c05bd6404bd2bf17514e2b61f1addefb45 not found: ID does not exist" containerID="6dd9cc97d1df93aae83a24e14daa92c05bd6404bd2bf17514e2b61f1addefb45"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.967665 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dd9cc97d1df93aae83a24e14daa92c05bd6404bd2bf17514e2b61f1addefb45"} err="failed to get container status \"6dd9cc97d1df93aae83a24e14daa92c05bd6404bd2bf17514e2b61f1addefb45\": rpc error: code = NotFound desc = could not find container \"6dd9cc97d1df93aae83a24e14daa92c05bd6404bd2bf17514e2b61f1addefb45\": container with ID starting with 6dd9cc97d1df93aae83a24e14daa92c05bd6404bd2bf17514e2b61f1addefb45 not found: ID does not exist"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.967737 4956 scope.go:117] "RemoveContainer" containerID="7bf08645b6b589c613417ea8e174b48a5635bc3d927db226a2aa877dc298e75d"
Sep 30 05:32:00 crc kubenswrapper[4956]: E0930 05:32:00.968067 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bf08645b6b589c613417ea8e174b48a5635bc3d927db226a2aa877dc298e75d\": container with ID starting with 7bf08645b6b589c613417ea8e174b48a5635bc3d927db226a2aa877dc298e75d not found: ID does not exist" containerID="7bf08645b6b589c613417ea8e174b48a5635bc3d927db226a2aa877dc298e75d"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.968094 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bf08645b6b589c613417ea8e174b48a5635bc3d927db226a2aa877dc298e75d"} err="failed to get container status \"7bf08645b6b589c613417ea8e174b48a5635bc3d927db226a2aa877dc298e75d\": rpc error: code = NotFound desc = could not find container \"7bf08645b6b589c613417ea8e174b48a5635bc3d927db226a2aa877dc298e75d\": container with ID starting with 7bf08645b6b589c613417ea8e174b48a5635bc3d927db226a2aa877dc298e75d not found: ID does not exist"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.968146 4956 scope.go:117] "RemoveContainer" containerID="6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.979089 4956 scope.go:117] "RemoveContainer" containerID="d1bc82e50ee3190325a275e065ba8480aef6da91ed6dfed9ce57cd406e1bec20"
Sep 30 05:32:00 crc kubenswrapper[4956]: I0930 05:32:00.995427 4956 scope.go:117] "RemoveContainer" containerID="ab7f4c4e6791fcb8454a2fd80da3004b24b2acee6ec02c67fdddc50bc9367005"
Sep 30 05:32:01 crc kubenswrapper[4956]: I0930 05:32:01.002542 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57114151-313a-4d50-bf79-8d92e5c50208-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57114151-313a-4d50-bf79-8d92e5c50208" (UID: "57114151-313a-4d50-bf79-8d92e5c50208"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:32:01 crc kubenswrapper[4956]: I0930 05:32:01.005984 4956 scope.go:117] "RemoveContainer" containerID="6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32"
Sep 30 05:32:01 crc kubenswrapper[4956]: E0930 05:32:01.006492 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32\": container with ID starting with 6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32 not found: ID does not exist" containerID="6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32"
Sep 30 05:32:01 crc kubenswrapper[4956]: I0930 05:32:01.006564 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32"} err="failed to get container status \"6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32\": rpc error: code = NotFound desc = could not find container \"6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32\": container with ID starting with 6ea69c2994071c6ba33ce2c843e8b10064b7137225c9bab88cac92d5efc3ac32 not found: ID does not exist"
Sep 30 05:32:01 crc kubenswrapper[4956]: I0930 05:32:01.006593 4956 scope.go:117] "RemoveContainer" containerID="d1bc82e50ee3190325a275e065ba8480aef6da91ed6dfed9ce57cd406e1bec20"
Sep 30 05:32:01 crc kubenswrapper[4956]: E0930 05:32:01.006835 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1bc82e50ee3190325a275e065ba8480aef6da91ed6dfed9ce57cd406e1bec20\": container with ID starting with d1bc82e50ee3190325a275e065ba8480aef6da91ed6dfed9ce57cd406e1bec20 not found: ID does not exist" containerID="d1bc82e50ee3190325a275e065ba8480aef6da91ed6dfed9ce57cd406e1bec20"
Sep 30 05:32:01 crc kubenswrapper[4956]: I0930 05:32:01.006925 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1bc82e50ee3190325a275e065ba8480aef6da91ed6dfed9ce57cd406e1bec20"} err="failed to get container status \"d1bc82e50ee3190325a275e065ba8480aef6da91ed6dfed9ce57cd406e1bec20\": rpc error: code = NotFound desc = could not find container \"d1bc82e50ee3190325a275e065ba8480aef6da91ed6dfed9ce57cd406e1bec20\": container with ID starting with d1bc82e50ee3190325a275e065ba8480aef6da91ed6dfed9ce57cd406e1bec20 not found: ID does not exist"
Sep 30 05:32:01 crc kubenswrapper[4956]: I0930 05:32:01.007005 4956 scope.go:117] "RemoveContainer" containerID="ab7f4c4e6791fcb8454a2fd80da3004b24b2acee6ec02c67fdddc50bc9367005"
Sep 30 05:32:01 crc kubenswrapper[4956]: E0930 05:32:01.007352 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab7f4c4e6791fcb8454a2fd80da3004b24b2acee6ec02c67fdddc50bc9367005\": container with ID starting with ab7f4c4e6791fcb8454a2fd80da3004b24b2acee6ec02c67fdddc50bc9367005 not found: ID does not exist" containerID="ab7f4c4e6791fcb8454a2fd80da3004b24b2acee6ec02c67fdddc50bc9367005"
Sep 30 05:32:01 crc kubenswrapper[4956]: I0930 05:32:01.007404 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7f4c4e6791fcb8454a2fd80da3004b24b2acee6ec02c67fdddc50bc9367005"} err="failed to get container status \"ab7f4c4e6791fcb8454a2fd80da3004b24b2acee6ec02c67fdddc50bc9367005\": rpc error: code = NotFound desc = could not find container \"ab7f4c4e6791fcb8454a2fd80da3004b24b2acee6ec02c67fdddc50bc9367005\": container with ID starting with ab7f4c4e6791fcb8454a2fd80da3004b24b2acee6ec02c67fdddc50bc9367005 not found: ID does not exist"
Sep 30 05:32:01 crc kubenswrapper[4956]: I0930 05:32:01.056133 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57114151-313a-4d50-bf79-8d92e5c50208-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 05:32:01 crc kubenswrapper[4956]: I0930 05:32:01.056165 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57114151-313a-4d50-bf79-8d92e5c50208-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 05:32:01 crc kubenswrapper[4956]: I0930 05:32:01.056200 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gs8h\" (UniqueName: \"kubernetes.io/projected/57114151-313a-4d50-bf79-8d92e5c50208-kube-api-access-2gs8h\") on node \"crc\" DevicePath \"\""
Sep 30 05:32:01 crc kubenswrapper[4956]: I0930 05:32:01.270028 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bvx6l"]
Sep 30 05:32:01 crc kubenswrapper[4956]: I0930 05:32:01.275147 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bvx6l"]
Sep 30 05:32:02 crc kubenswrapper[4956]: I0930 05:32:02.347329 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57114151-313a-4d50-bf79-8d92e5c50208" path="/var/lib/kubelet/pods/57114151-313a-4d50-bf79-8d92e5c50208/volumes"
Sep 30 05:32:02 crc kubenswrapper[4956]: I0930 05:32:02.348398 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b351e8a-d6e9-426c-a3fc-b486964f28c7" path="/var/lib/kubelet/pods/7b351e8a-d6e9-426c-a3fc-b486964f28c7/volumes"
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.289199 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9c8r"]
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.290309 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" podUID="39bae241-73b1-4078-9861-309c762b38b5" containerName="controller-manager" containerID="cri-o://cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d" gracePeriod=30
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.393209 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2"]
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.393801 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" podUID="050f9843-c228-4681-96e9-8649f7eff6fc" containerName="route-controller-manager" containerID="cri-o://c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3" gracePeriod=30
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.623739 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r"
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.701363 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2"
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.755295 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-config\") pod \"39bae241-73b1-4078-9861-309c762b38b5\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") "
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.755620 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-client-ca\") pod \"39bae241-73b1-4078-9861-309c762b38b5\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") "
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.755652 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l997k\" (UniqueName: \"kubernetes.io/projected/39bae241-73b1-4078-9861-309c762b38b5-kube-api-access-l997k\") pod \"39bae241-73b1-4078-9861-309c762b38b5\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") "
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.755731 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39bae241-73b1-4078-9861-309c762b38b5-serving-cert\") pod \"39bae241-73b1-4078-9861-309c762b38b5\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") "
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.755759 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-proxy-ca-bundles\") pod \"39bae241-73b1-4078-9861-309c762b38b5\" (UID: \"39bae241-73b1-4078-9861-309c762b38b5\") "
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.756400 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-client-ca" (OuterVolumeSpecName: "client-ca") pod "39bae241-73b1-4078-9861-309c762b38b5" (UID: "39bae241-73b1-4078-9861-309c762b38b5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.756418 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-config" (OuterVolumeSpecName: "config") pod "39bae241-73b1-4078-9861-309c762b38b5" (UID: "39bae241-73b1-4078-9861-309c762b38b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.757665 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "39bae241-73b1-4078-9861-309c762b38b5" (UID: "39bae241-73b1-4078-9861-309c762b38b5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.762480 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39bae241-73b1-4078-9861-309c762b38b5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "39bae241-73b1-4078-9861-309c762b38b5" (UID: "39bae241-73b1-4078-9861-309c762b38b5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.762596 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39bae241-73b1-4078-9861-309c762b38b5-kube-api-access-l997k" (OuterVolumeSpecName: "kube-api-access-l997k") pod "39bae241-73b1-4078-9861-309c762b38b5" (UID: "39bae241-73b1-4078-9861-309c762b38b5"). InnerVolumeSpecName "kube-api-access-l997k".
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.856895 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050f9843-c228-4681-96e9-8649f7eff6fc-config\") pod \"050f9843-c228-4681-96e9-8649f7eff6fc\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.856958 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8vpq\" (UniqueName: \"kubernetes.io/projected/050f9843-c228-4681-96e9-8649f7eff6fc-kube-api-access-k8vpq\") pod \"050f9843-c228-4681-96e9-8649f7eff6fc\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.857081 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/050f9843-c228-4681-96e9-8649f7eff6fc-serving-cert\") pod \"050f9843-c228-4681-96e9-8649f7eff6fc\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.857140 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/050f9843-c228-4681-96e9-8649f7eff6fc-client-ca\") pod \"050f9843-c228-4681-96e9-8649f7eff6fc\" (UID: \"050f9843-c228-4681-96e9-8649f7eff6fc\") " Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.857472 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39bae241-73b1-4078-9861-309c762b38b5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.857505 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 
05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.857524 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.857541 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39bae241-73b1-4078-9861-309c762b38b5-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.857558 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l997k\" (UniqueName: \"kubernetes.io/projected/39bae241-73b1-4078-9861-309c762b38b5-kube-api-access-l997k\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.858111 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050f9843-c228-4681-96e9-8649f7eff6fc-config" (OuterVolumeSpecName: "config") pod "050f9843-c228-4681-96e9-8649f7eff6fc" (UID: "050f9843-c228-4681-96e9-8649f7eff6fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.858339 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050f9843-c228-4681-96e9-8649f7eff6fc-client-ca" (OuterVolumeSpecName: "client-ca") pod "050f9843-c228-4681-96e9-8649f7eff6fc" (UID: "050f9843-c228-4681-96e9-8649f7eff6fc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.860776 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/050f9843-c228-4681-96e9-8649f7eff6fc-kube-api-access-k8vpq" (OuterVolumeSpecName: "kube-api-access-k8vpq") pod "050f9843-c228-4681-96e9-8649f7eff6fc" (UID: "050f9843-c228-4681-96e9-8649f7eff6fc"). 
InnerVolumeSpecName "kube-api-access-k8vpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.861669 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050f9843-c228-4681-96e9-8649f7eff6fc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "050f9843-c228-4681-96e9-8649f7eff6fc" (UID: "050f9843-c228-4681-96e9-8649f7eff6fc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.959098 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050f9843-c228-4681-96e9-8649f7eff6fc-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.959143 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8vpq\" (UniqueName: \"kubernetes.io/projected/050f9843-c228-4681-96e9-8649f7eff6fc-kube-api-access-k8vpq\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.959155 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/050f9843-c228-4681-96e9-8649f7eff6fc-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:16 crc kubenswrapper[4956]: I0930 05:32:16.959164 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/050f9843-c228-4681-96e9-8649f7eff6fc-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.006633 4956 generic.go:334] "Generic (PLEG): container finished" podID="050f9843-c228-4681-96e9-8649f7eff6fc" containerID="c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3" exitCode=0 Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.006709 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" event={"ID":"050f9843-c228-4681-96e9-8649f7eff6fc","Type":"ContainerDied","Data":"c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3"} Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.006735 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" event={"ID":"050f9843-c228-4681-96e9-8649f7eff6fc","Type":"ContainerDied","Data":"2db853daeb8aef784ed0a57451fa2635caecedb67f613c991dcbe66863168131"} Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.006750 4956 scope.go:117] "RemoveContainer" containerID="c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.006836 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.015410 4956 generic.go:334] "Generic (PLEG): container finished" podID="39bae241-73b1-4078-9861-309c762b38b5" containerID="cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d" exitCode=0 Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.015456 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" event={"ID":"39bae241-73b1-4078-9861-309c762b38b5","Type":"ContainerDied","Data":"cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d"} Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.015486 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" event={"ID":"39bae241-73b1-4078-9861-309c762b38b5","Type":"ContainerDied","Data":"49d9abbcf43428b3d6b7abc41c0dd46dd2ea1b48723520a0538356ed43dffbe0"} Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.015567 4956 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9c8r" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.026789 4956 scope.go:117] "RemoveContainer" containerID="c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.027352 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3\": container with ID starting with c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3 not found: ID does not exist" containerID="c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.027393 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3"} err="failed to get container status \"c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3\": rpc error: code = NotFound desc = could not find container \"c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3\": container with ID starting with c64865e4038add83e6c6f4495a87508e017af0038f1f39304e21c8a0c8ac38c3 not found: ID does not exist" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.027416 4956 scope.go:117] "RemoveContainer" containerID="cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.050091 4956 scope.go:117] "RemoveContainer" containerID="cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.053929 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d\": container 
with ID starting with cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d not found: ID does not exist" containerID="cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.053979 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d"} err="failed to get container status \"cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d\": rpc error: code = NotFound desc = could not find container \"cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d\": container with ID starting with cd9e55eeae0300222f58ad3259736d631c31824bcbe41253687c7c8f8ac14a9d not found: ID does not exist" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.056275 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9c8r"] Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.057595 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9c8r"] Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.062701 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2"] Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.068965 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5khh2"] Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701157 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-695fdf4d89-ltt82"] Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701473 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57114151-313a-4d50-bf79-8d92e5c50208" containerName="extract-content" Sep 30 05:32:17 crc 
kubenswrapper[4956]: I0930 05:32:17.701488 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57114151-313a-4d50-bf79-8d92e5c50208" containerName="extract-content" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701498 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57114151-313a-4d50-bf79-8d92e5c50208" containerName="registry-server" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701506 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57114151-313a-4d50-bf79-8d92e5c50208" containerName="registry-server" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701520 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39bae241-73b1-4078-9861-309c762b38b5" containerName="controller-manager" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701529 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="39bae241-73b1-4078-9861-309c762b38b5" containerName="controller-manager" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701543 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="050f9843-c228-4681-96e9-8649f7eff6fc" containerName="route-controller-manager" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701550 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="050f9843-c228-4681-96e9-8649f7eff6fc" containerName="route-controller-manager" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701560 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa47aa22-683e-413d-b004-054e1e331342" containerName="extract-content" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701569 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa47aa22-683e-413d-b004-054e1e331342" containerName="extract-content" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701579 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b351e8a-d6e9-426c-a3fc-b486964f28c7" containerName="extract-utilities" Sep 30 
05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701586 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b351e8a-d6e9-426c-a3fc-b486964f28c7" containerName="extract-utilities" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701599 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa47aa22-683e-413d-b004-054e1e331342" containerName="extract-utilities" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701606 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa47aa22-683e-413d-b004-054e1e331342" containerName="extract-utilities" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701618 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d133d89e-16c0-45d9-9438-58398a70ac5d" containerName="pruner" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701625 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d133d89e-16c0-45d9-9438-58398a70ac5d" containerName="pruner" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701635 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f6ab45a-e36e-4328-a0ff-1a62a322133e" containerName="pruner" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701642 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f6ab45a-e36e-4328-a0ff-1a62a322133e" containerName="pruner" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701653 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa47aa22-683e-413d-b004-054e1e331342" containerName="registry-server" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701661 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa47aa22-683e-413d-b004-054e1e331342" containerName="registry-server" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701675 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b351e8a-d6e9-426c-a3fc-b486964f28c7" containerName="extract-content" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 
05:32:17.701683 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b351e8a-d6e9-426c-a3fc-b486964f28c7" containerName="extract-content" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701690 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57114151-313a-4d50-bf79-8d92e5c50208" containerName="extract-utilities" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701697 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="57114151-313a-4d50-bf79-8d92e5c50208" containerName="extract-utilities" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701709 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b351e8a-d6e9-426c-a3fc-b486964f28c7" containerName="registry-server" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701716 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b351e8a-d6e9-426c-a3fc-b486964f28c7" containerName="registry-server" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701727 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec84e71d-6ee8-4833-b914-7a7c103ac14e" containerName="extract-content" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701734 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec84e71d-6ee8-4833-b914-7a7c103ac14e" containerName="extract-content" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701743 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec84e71d-6ee8-4833-b914-7a7c103ac14e" containerName="registry-server" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701750 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec84e71d-6ee8-4833-b914-7a7c103ac14e" containerName="registry-server" Sep 30 05:32:17 crc kubenswrapper[4956]: E0930 05:32:17.701762 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec84e71d-6ee8-4833-b914-7a7c103ac14e" containerName="extract-utilities" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 
05:32:17.701769 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec84e71d-6ee8-4833-b914-7a7c103ac14e" containerName="extract-utilities" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701884 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="050f9843-c228-4681-96e9-8649f7eff6fc" containerName="route-controller-manager" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701900 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d133d89e-16c0-45d9-9438-58398a70ac5d" containerName="pruner" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701912 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f6ab45a-e36e-4328-a0ff-1a62a322133e" containerName="pruner" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701922 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec84e71d-6ee8-4833-b914-7a7c103ac14e" containerName="registry-server" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701932 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="39bae241-73b1-4078-9861-309c762b38b5" containerName="controller-manager" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701963 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b351e8a-d6e9-426c-a3fc-b486964f28c7" containerName="registry-server" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701973 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="57114151-313a-4d50-bf79-8d92e5c50208" containerName="registry-server" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.701987 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa47aa22-683e-413d-b004-054e1e331342" containerName="registry-server" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.702730 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.706284 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6"] Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.706962 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.707102 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.710556 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.710935 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.711032 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.711220 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.711255 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.711266 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.712573 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-695fdf4d89-ltt82"] Sep 
30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.713251 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.713678 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.713850 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.713871 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.713977 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.714720 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6"] Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.723513 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.869232 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knnkx\" (UniqueName: \"kubernetes.io/projected/a2dc958d-4809-485f-9689-01729c86fbad-kube-api-access-knnkx\") pod \"route-controller-manager-5d8b8d4467-96jd6\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.869293 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/a2dc958d-4809-485f-9689-01729c86fbad-client-ca\") pod \"route-controller-manager-5d8b8d4467-96jd6\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.869314 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-client-ca\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.869329 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2dc958d-4809-485f-9689-01729c86fbad-serving-cert\") pod \"route-controller-manager-5d8b8d4467-96jd6\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.869349 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-proxy-ca-bundles\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.869374 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv9rh\" (UniqueName: \"kubernetes.io/projected/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-kube-api-access-cv9rh\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " 
pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.869505 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-config\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.869637 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2dc958d-4809-485f-9689-01729c86fbad-config\") pod \"route-controller-manager-5d8b8d4467-96jd6\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.869705 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-serving-cert\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.970285 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2dc958d-4809-485f-9689-01729c86fbad-client-ca\") pod \"route-controller-manager-5d8b8d4467-96jd6\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.970324 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-client-ca\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.970344 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2dc958d-4809-485f-9689-01729c86fbad-serving-cert\") pod \"route-controller-manager-5d8b8d4467-96jd6\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.970360 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-proxy-ca-bundles\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.970376 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv9rh\" (UniqueName: \"kubernetes.io/projected/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-kube-api-access-cv9rh\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.970401 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-config\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 
05:32:17.970426 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2dc958d-4809-485f-9689-01729c86fbad-config\") pod \"route-controller-manager-5d8b8d4467-96jd6\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.970448 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-serving-cert\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.970487 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knnkx\" (UniqueName: \"kubernetes.io/projected/a2dc958d-4809-485f-9689-01729c86fbad-kube-api-access-knnkx\") pod \"route-controller-manager-5d8b8d4467-96jd6\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.971348 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2dc958d-4809-485f-9689-01729c86fbad-client-ca\") pod \"route-controller-manager-5d8b8d4467-96jd6\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.971430 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-client-ca\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") 
" pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.972394 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-config\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.973301 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2dc958d-4809-485f-9689-01729c86fbad-config\") pod \"route-controller-manager-5d8b8d4467-96jd6\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.973551 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-proxy-ca-bundles\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.977417 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-serving-cert\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.985898 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2dc958d-4809-485f-9689-01729c86fbad-serving-cert\") pod 
\"route-controller-manager-5d8b8d4467-96jd6\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.990088 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knnkx\" (UniqueName: \"kubernetes.io/projected/a2dc958d-4809-485f-9689-01729c86fbad-kube-api-access-knnkx\") pod \"route-controller-manager-5d8b8d4467-96jd6\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:17 crc kubenswrapper[4956]: I0930 05:32:17.991815 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv9rh\" (UniqueName: \"kubernetes.io/projected/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-kube-api-access-cv9rh\") pod \"controller-manager-695fdf4d89-ltt82\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:18 crc kubenswrapper[4956]: I0930 05:32:18.020891 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:18 crc kubenswrapper[4956]: I0930 05:32:18.029395 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:18 crc kubenswrapper[4956]: I0930 05:32:18.074038 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:32:18 crc kubenswrapper[4956]: I0930 05:32:18.074143 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:32:18 crc kubenswrapper[4956]: I0930 05:32:18.074208 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:32:18 crc kubenswrapper[4956]: I0930 05:32:18.074992 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 05:32:18 crc kubenswrapper[4956]: I0930 05:32:18.075092 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef" gracePeriod=600 Sep 30 05:32:18 crc kubenswrapper[4956]: I0930 05:32:18.222280 4956 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6"] Sep 30 05:32:18 crc kubenswrapper[4956]: I0930 05:32:18.262143 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-695fdf4d89-ltt82"] Sep 30 05:32:18 crc kubenswrapper[4956]: W0930 05:32:18.269636 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1842a0b_4bcc_48bd_838c_0c5d81c5252c.slice/crio-fd7c895641e020a44d1ef7807a2ce4c8822990fddaca2da0e3a79afd66225cf3 WatchSource:0}: Error finding container fd7c895641e020a44d1ef7807a2ce4c8822990fddaca2da0e3a79afd66225cf3: Status 404 returned error can't find the container with id fd7c895641e020a44d1ef7807a2ce4c8822990fddaca2da0e3a79afd66225cf3 Sep 30 05:32:18 crc kubenswrapper[4956]: I0930 05:32:18.351477 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="050f9843-c228-4681-96e9-8649f7eff6fc" path="/var/lib/kubelet/pods/050f9843-c228-4681-96e9-8649f7eff6fc/volumes" Sep 30 05:32:18 crc kubenswrapper[4956]: I0930 05:32:18.353643 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39bae241-73b1-4078-9861-309c762b38b5" path="/var/lib/kubelet/pods/39bae241-73b1-4078-9861-309c762b38b5/volumes" Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.034589 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" event={"ID":"a2dc958d-4809-485f-9689-01729c86fbad","Type":"ContainerStarted","Data":"597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0"} Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.034969 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.034987 4956 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" event={"ID":"a2dc958d-4809-485f-9689-01729c86fbad","Type":"ContainerStarted","Data":"0399e84697bb4bb779e7b3263e63d0f70ab02402981ec97c93cc6843d17581f9"} Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.040546 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef" exitCode=0 Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.040656 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef"} Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.040730 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"f7ecb2580b031d845f054b28f4a97a97c58fe86efb667130fe849d07f1e2cafa"} Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.048557 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" event={"ID":"d1842a0b-4bcc-48bd-838c-0c5d81c5252c","Type":"ContainerStarted","Data":"04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face"} Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.048595 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" event={"ID":"d1842a0b-4bcc-48bd-838c-0c5d81c5252c","Type":"ContainerStarted","Data":"fd7c895641e020a44d1ef7807a2ce4c8822990fddaca2da0e3a79afd66225cf3"} Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.048947 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.051062 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.054593 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" podStartSLOduration=3.054574252 podStartE2EDuration="3.054574252s" podCreationTimestamp="2025-09-30 05:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:32:19.051237645 +0000 UTC m=+209.378358170" watchObservedRunningTime="2025-09-30 05:32:19.054574252 +0000 UTC m=+209.381694838" Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.059745 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:19 crc kubenswrapper[4956]: I0930 05:32:19.071829 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" podStartSLOduration=3.071810127 podStartE2EDuration="3.071810127s" podCreationTimestamp="2025-09-30 05:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:32:19.066562308 +0000 UTC m=+209.393682823" watchObservedRunningTime="2025-09-30 05:32:19.071810127 +0000 UTC m=+209.398930642" Sep 30 05:32:36 crc kubenswrapper[4956]: I0930 05:32:36.315632 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-695fdf4d89-ltt82"] Sep 30 05:32:36 crc kubenswrapper[4956]: I0930 05:32:36.316742 4956 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" podUID="d1842a0b-4bcc-48bd-838c-0c5d81c5252c" containerName="controller-manager" containerID="cri-o://04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face" gracePeriod=30 Sep 30 05:32:36 crc kubenswrapper[4956]: I0930 05:32:36.350973 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6"] Sep 30 05:32:36 crc kubenswrapper[4956]: I0930 05:32:36.351339 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" podUID="a2dc958d-4809-485f-9689-01729c86fbad" containerName="route-controller-manager" containerID="cri-o://597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0" gracePeriod=30 Sep 30 05:32:36 crc kubenswrapper[4956]: I0930 05:32:36.807750 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:36 crc kubenswrapper[4956]: I0930 05:32:36.867781 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.023310 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2dc958d-4809-485f-9689-01729c86fbad-config\") pod \"a2dc958d-4809-485f-9689-01729c86fbad\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.023366 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knnkx\" (UniqueName: \"kubernetes.io/projected/a2dc958d-4809-485f-9689-01729c86fbad-kube-api-access-knnkx\") pod \"a2dc958d-4809-485f-9689-01729c86fbad\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.023382 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2dc958d-4809-485f-9689-01729c86fbad-client-ca\") pod \"a2dc958d-4809-485f-9689-01729c86fbad\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.023421 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2dc958d-4809-485f-9689-01729c86fbad-serving-cert\") pod \"a2dc958d-4809-485f-9689-01729c86fbad\" (UID: \"a2dc958d-4809-485f-9689-01729c86fbad\") " Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.026687 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2dc958d-4809-485f-9689-01729c86fbad-config" (OuterVolumeSpecName: "config") pod "a2dc958d-4809-485f-9689-01729c86fbad" (UID: "a2dc958d-4809-485f-9689-01729c86fbad"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.026728 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2dc958d-4809-485f-9689-01729c86fbad-client-ca" (OuterVolumeSpecName: "client-ca") pod "a2dc958d-4809-485f-9689-01729c86fbad" (UID: "a2dc958d-4809-485f-9689-01729c86fbad"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.028221 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2dc958d-4809-485f-9689-01729c86fbad-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a2dc958d-4809-485f-9689-01729c86fbad" (UID: "a2dc958d-4809-485f-9689-01729c86fbad"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.032347 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2dc958d-4809-485f-9689-01729c86fbad-kube-api-access-knnkx" (OuterVolumeSpecName: "kube-api-access-knnkx") pod "a2dc958d-4809-485f-9689-01729c86fbad" (UID: "a2dc958d-4809-485f-9689-01729c86fbad"). InnerVolumeSpecName "kube-api-access-knnkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.124468 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-proxy-ca-bundles\") pod \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.124520 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-serving-cert\") pod \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.124556 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-client-ca\") pod \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.124622 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-config\") pod \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.124660 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv9rh\" (UniqueName: \"kubernetes.io/projected/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-kube-api-access-cv9rh\") pod \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\" (UID: \"d1842a0b-4bcc-48bd-838c-0c5d81c5252c\") " Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.125545 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d1842a0b-4bcc-48bd-838c-0c5d81c5252c" (UID: "d1842a0b-4bcc-48bd-838c-0c5d81c5252c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.125612 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-config" (OuterVolumeSpecName: "config") pod "d1842a0b-4bcc-48bd-838c-0c5d81c5252c" (UID: "d1842a0b-4bcc-48bd-838c-0c5d81c5252c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.125596 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-client-ca" (OuterVolumeSpecName: "client-ca") pod "d1842a0b-4bcc-48bd-838c-0c5d81c5252c" (UID: "d1842a0b-4bcc-48bd-838c-0c5d81c5252c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.125817 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.125828 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2dc958d-4809-485f-9689-01729c86fbad-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.125839 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knnkx\" (UniqueName: \"kubernetes.io/projected/a2dc958d-4809-485f-9689-01729c86fbad-kube-api-access-knnkx\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.125849 4956 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2dc958d-4809-485f-9689-01729c86fbad-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.125856 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.125864 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2dc958d-4809-485f-9689-01729c86fbad-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.125872 4956 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.127735 4956 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-kube-api-access-cv9rh" (OuterVolumeSpecName: "kube-api-access-cv9rh") pod "d1842a0b-4bcc-48bd-838c-0c5d81c5252c" (UID: "d1842a0b-4bcc-48bd-838c-0c5d81c5252c"). InnerVolumeSpecName "kube-api-access-cv9rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.128213 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d1842a0b-4bcc-48bd-838c-0c5d81c5252c" (UID: "d1842a0b-4bcc-48bd-838c-0c5d81c5252c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.150171 4956 generic.go:334] "Generic (PLEG): container finished" podID="d1842a0b-4bcc-48bd-838c-0c5d81c5252c" containerID="04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face" exitCode=0 Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.150212 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" event={"ID":"d1842a0b-4bcc-48bd-838c-0c5d81c5252c","Type":"ContainerDied","Data":"04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face"} Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.150262 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.150306 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-695fdf4d89-ltt82" event={"ID":"d1842a0b-4bcc-48bd-838c-0c5d81c5252c","Type":"ContainerDied","Data":"fd7c895641e020a44d1ef7807a2ce4c8822990fddaca2da0e3a79afd66225cf3"} Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.150355 4956 scope.go:117] "RemoveContainer" containerID="04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.151541 4956 generic.go:334] "Generic (PLEG): container finished" podID="a2dc958d-4809-485f-9689-01729c86fbad" containerID="597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0" exitCode=0 Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.151571 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" event={"ID":"a2dc958d-4809-485f-9689-01729c86fbad","Type":"ContainerDied","Data":"597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0"} Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.151597 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" event={"ID":"a2dc958d-4809-485f-9689-01729c86fbad","Type":"ContainerDied","Data":"0399e84697bb4bb779e7b3263e63d0f70ab02402981ec97c93cc6843d17581f9"} Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.151643 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.170432 4956 scope.go:117] "RemoveContainer" containerID="04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face" Sep 30 05:32:37 crc kubenswrapper[4956]: E0930 05:32:37.171100 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face\": container with ID starting with 04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face not found: ID does not exist" containerID="04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.171205 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face"} err="failed to get container status \"04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face\": rpc error: code = NotFound desc = could not find container \"04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face\": container with ID starting with 04572b2d49ee96c09251aafa85e294f23f0a10ff42986a5b1c5749172901face not found: ID does not exist" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.171247 4956 scope.go:117] "RemoveContainer" containerID="597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.177058 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6"] Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.179491 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d8b8d4467-96jd6"] Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.190255 4956 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-695fdf4d89-ltt82"] Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.191932 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-695fdf4d89-ltt82"] Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.194825 4956 scope.go:117] "RemoveContainer" containerID="597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0" Sep 30 05:32:37 crc kubenswrapper[4956]: E0930 05:32:37.195474 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0\": container with ID starting with 597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0 not found: ID does not exist" containerID="597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.195548 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0"} err="failed to get container status \"597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0\": rpc error: code = NotFound desc = could not find container \"597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0\": container with ID starting with 597cdc03a8c2ce72bad111dc75fa8d326cf8158da4ac521230b25905ad8e14f0 not found: ID does not exist" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.227307 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv9rh\" (UniqueName: \"kubernetes.io/projected/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-kube-api-access-cv9rh\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.227346 4956 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d1842a0b-4bcc-48bd-838c-0c5d81c5252c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.720284 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6"] Sep 30 05:32:37 crc kubenswrapper[4956]: E0930 05:32:37.720934 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1842a0b-4bcc-48bd-838c-0c5d81c5252c" containerName="controller-manager" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.720948 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1842a0b-4bcc-48bd-838c-0c5d81c5252c" containerName="controller-manager" Sep 30 05:32:37 crc kubenswrapper[4956]: E0930 05:32:37.720962 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2dc958d-4809-485f-9689-01729c86fbad" containerName="route-controller-manager" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.720970 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2dc958d-4809-485f-9689-01729c86fbad" containerName="route-controller-manager" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.721089 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1842a0b-4bcc-48bd-838c-0c5d81c5252c" containerName="controller-manager" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.721103 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2dc958d-4809-485f-9689-01729c86fbad" containerName="route-controller-manager" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.721501 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.723336 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.723785 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.724039 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.724094 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.724140 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.725058 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.725554 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b44bbf667-6nlhs"] Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.726169 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.728626 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.730251 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.731062 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.731233 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.731367 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.731547 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6"] Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.734684 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.740753 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b44bbf667-6nlhs"] Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.749008 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.834578 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9be7381f-be01-4a78-9048-44aa82faa743-client-ca\") pod \"route-controller-manager-656fb7fcc7-j8vx6\" (UID: \"9be7381f-be01-4a78-9048-44aa82faa743\") " pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.834639 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgfnw\" (UniqueName: \"kubernetes.io/projected/9be7381f-be01-4a78-9048-44aa82faa743-kube-api-access-zgfnw\") pod \"route-controller-manager-656fb7fcc7-j8vx6\" (UID: \"9be7381f-be01-4a78-9048-44aa82faa743\") " pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.834685 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8196d04-3adc-49ca-9137-d84f4d07cbce-serving-cert\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.834775 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8196d04-3adc-49ca-9137-d84f4d07cbce-client-ca\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.834794 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8196d04-3adc-49ca-9137-d84f4d07cbce-proxy-ca-bundles\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " 
pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.834831 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be7381f-be01-4a78-9048-44aa82faa743-config\") pod \"route-controller-manager-656fb7fcc7-j8vx6\" (UID: \"9be7381f-be01-4a78-9048-44aa82faa743\") " pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.834859 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd9kh\" (UniqueName: \"kubernetes.io/projected/e8196d04-3adc-49ca-9137-d84f4d07cbce-kube-api-access-gd9kh\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.834879 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8196d04-3adc-49ca-9137-d84f4d07cbce-config\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.835029 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9be7381f-be01-4a78-9048-44aa82faa743-serving-cert\") pod \"route-controller-manager-656fb7fcc7-j8vx6\" (UID: \"9be7381f-be01-4a78-9048-44aa82faa743\") " pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.935957 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/9be7381f-be01-4a78-9048-44aa82faa743-serving-cert\") pod \"route-controller-manager-656fb7fcc7-j8vx6\" (UID: \"9be7381f-be01-4a78-9048-44aa82faa743\") " pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.935997 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9be7381f-be01-4a78-9048-44aa82faa743-client-ca\") pod \"route-controller-manager-656fb7fcc7-j8vx6\" (UID: \"9be7381f-be01-4a78-9048-44aa82faa743\") " pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.936019 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgfnw\" (UniqueName: \"kubernetes.io/projected/9be7381f-be01-4a78-9048-44aa82faa743-kube-api-access-zgfnw\") pod \"route-controller-manager-656fb7fcc7-j8vx6\" (UID: \"9be7381f-be01-4a78-9048-44aa82faa743\") " pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.936042 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8196d04-3adc-49ca-9137-d84f4d07cbce-serving-cert\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.936075 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8196d04-3adc-49ca-9137-d84f4d07cbce-client-ca\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 
30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.936092 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8196d04-3adc-49ca-9137-d84f4d07cbce-proxy-ca-bundles\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.936112 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be7381f-be01-4a78-9048-44aa82faa743-config\") pod \"route-controller-manager-656fb7fcc7-j8vx6\" (UID: \"9be7381f-be01-4a78-9048-44aa82faa743\") " pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.936153 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9kh\" (UniqueName: \"kubernetes.io/projected/e8196d04-3adc-49ca-9137-d84f4d07cbce-kube-api-access-gd9kh\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.936173 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8196d04-3adc-49ca-9137-d84f4d07cbce-config\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.937377 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8196d04-3adc-49ca-9137-d84f4d07cbce-config\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: 
\"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.938785 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8196d04-3adc-49ca-9137-d84f4d07cbce-client-ca\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.938890 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be7381f-be01-4a78-9048-44aa82faa743-config\") pod \"route-controller-manager-656fb7fcc7-j8vx6\" (UID: \"9be7381f-be01-4a78-9048-44aa82faa743\") " pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.938965 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9be7381f-be01-4a78-9048-44aa82faa743-client-ca\") pod \"route-controller-manager-656fb7fcc7-j8vx6\" (UID: \"9be7381f-be01-4a78-9048-44aa82faa743\") " pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.939168 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8196d04-3adc-49ca-9137-d84f4d07cbce-proxy-ca-bundles\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.941260 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9be7381f-be01-4a78-9048-44aa82faa743-serving-cert\") pod \"route-controller-manager-656fb7fcc7-j8vx6\" (UID: \"9be7381f-be01-4a78-9048-44aa82faa743\") " pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.942624 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8196d04-3adc-49ca-9137-d84f4d07cbce-serving-cert\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.954189 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd9kh\" (UniqueName: \"kubernetes.io/projected/e8196d04-3adc-49ca-9137-d84f4d07cbce-kube-api-access-gd9kh\") pod \"controller-manager-b44bbf667-6nlhs\" (UID: \"e8196d04-3adc-49ca-9137-d84f4d07cbce\") " pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:37 crc kubenswrapper[4956]: I0930 05:32:37.955700 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgfnw\" (UniqueName: \"kubernetes.io/projected/9be7381f-be01-4a78-9048-44aa82faa743-kube-api-access-zgfnw\") pod \"route-controller-manager-656fb7fcc7-j8vx6\" (UID: \"9be7381f-be01-4a78-9048-44aa82faa743\") " pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:38 crc kubenswrapper[4956]: I0930 05:32:38.070232 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:38 crc kubenswrapper[4956]: I0930 05:32:38.076669 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:38 crc kubenswrapper[4956]: I0930 05:32:38.266362 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6"] Sep 30 05:32:38 crc kubenswrapper[4956]: W0930 05:32:38.276547 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9be7381f_be01_4a78_9048_44aa82faa743.slice/crio-6a8a0c72dc021a4321d067ee2b1493f2b25201e90040ab0935b763dcd8ca1171 WatchSource:0}: Error finding container 6a8a0c72dc021a4321d067ee2b1493f2b25201e90040ab0935b763dcd8ca1171: Status 404 returned error can't find the container with id 6a8a0c72dc021a4321d067ee2b1493f2b25201e90040ab0935b763dcd8ca1171 Sep 30 05:32:38 crc kubenswrapper[4956]: I0930 05:32:38.305592 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b44bbf667-6nlhs"] Sep 30 05:32:38 crc kubenswrapper[4956]: W0930 05:32:38.314302 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8196d04_3adc_49ca_9137_d84f4d07cbce.slice/crio-dd7a3ef719ee86f08144eaf3938c91fb7919b1e61c44b3fa5ddc1692bbe9e06e WatchSource:0}: Error finding container dd7a3ef719ee86f08144eaf3938c91fb7919b1e61c44b3fa5ddc1692bbe9e06e: Status 404 returned error can't find the container with id dd7a3ef719ee86f08144eaf3938c91fb7919b1e61c44b3fa5ddc1692bbe9e06e Sep 30 05:32:38 crc kubenswrapper[4956]: I0930 05:32:38.349788 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2dc958d-4809-485f-9689-01729c86fbad" path="/var/lib/kubelet/pods/a2dc958d-4809-485f-9689-01729c86fbad/volumes" Sep 30 05:32:38 crc kubenswrapper[4956]: I0930 05:32:38.350535 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1842a0b-4bcc-48bd-838c-0c5d81c5252c" 
path="/var/lib/kubelet/pods/d1842a0b-4bcc-48bd-838c-0c5d81c5252c/volumes" Sep 30 05:32:39 crc kubenswrapper[4956]: I0930 05:32:39.179994 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" event={"ID":"9be7381f-be01-4a78-9048-44aa82faa743","Type":"ContainerStarted","Data":"3f796239dc9b9b08cbc23be18fdcee3e269948481f5410dfc6e9c7d730903e54"} Sep 30 05:32:39 crc kubenswrapper[4956]: I0930 05:32:39.180270 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" event={"ID":"9be7381f-be01-4a78-9048-44aa82faa743","Type":"ContainerStarted","Data":"6a8a0c72dc021a4321d067ee2b1493f2b25201e90040ab0935b763dcd8ca1171"} Sep 30 05:32:39 crc kubenswrapper[4956]: I0930 05:32:39.180461 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:39 crc kubenswrapper[4956]: I0930 05:32:39.181631 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" event={"ID":"e8196d04-3adc-49ca-9137-d84f4d07cbce","Type":"ContainerStarted","Data":"1d1d2690dc3ce9b33f4b4fe6299f0254372fe56ec4464cbc4be3c5d42e1f0427"} Sep 30 05:32:39 crc kubenswrapper[4956]: I0930 05:32:39.181662 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" event={"ID":"e8196d04-3adc-49ca-9137-d84f4d07cbce","Type":"ContainerStarted","Data":"dd7a3ef719ee86f08144eaf3938c91fb7919b1e61c44b3fa5ddc1692bbe9e06e"} Sep 30 05:32:39 crc kubenswrapper[4956]: I0930 05:32:39.181929 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:39 crc kubenswrapper[4956]: I0930 05:32:39.185912 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" Sep 30 05:32:39 crc kubenswrapper[4956]: I0930 05:32:39.186382 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" Sep 30 05:32:39 crc kubenswrapper[4956]: I0930 05:32:39.199322 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-656fb7fcc7-j8vx6" podStartSLOduration=3.199307967 podStartE2EDuration="3.199307967s" podCreationTimestamp="2025-09-30 05:32:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:32:39.196741955 +0000 UTC m=+229.523862490" watchObservedRunningTime="2025-09-30 05:32:39.199307967 +0000 UTC m=+229.526428492" Sep 30 05:32:39 crc kubenswrapper[4956]: I0930 05:32:39.214596 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b44bbf667-6nlhs" podStartSLOduration=3.214581389 podStartE2EDuration="3.214581389s" podCreationTimestamp="2025-09-30 05:32:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:32:39.213965349 +0000 UTC m=+229.541085894" watchObservedRunningTime="2025-09-30 05:32:39.214581389 +0000 UTC m=+229.541701914" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.275860 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mmr6c"] Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.277441 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mmr6c" podUID="cb546d14-6d5d-4b44-b501-070ea2251e4b" containerName="registry-server" 
containerID="cri-o://49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138" gracePeriod=30 Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.283324 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4nrh7"] Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.288323 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4nrh7" podUID="4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" containerName="registry-server" containerID="cri-o://62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959" gracePeriod=30 Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.291308 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rvfx8"] Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.291497 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" podUID="9bdc12c7-725c-4496-8e0e-e4ec3d911cca" containerName="marketplace-operator" containerID="cri-o://a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e" gracePeriod=30 Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.307065 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmln2"] Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.307597 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lmln2" podUID="dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" containerName="registry-server" containerID="cri-o://8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839" gracePeriod=30 Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.322557 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2zl8q"] Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 
05:32:40.323311 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.342696 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gv9cd"] Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.343184 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gv9cd" podUID="ac16a38f-5e71-4eee-8189-9361cfd19b84" containerName="registry-server" containerID="cri-o://795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101" gracePeriod=30 Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.354922 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2zl8q"] Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.469895 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdqzs\" (UniqueName: \"kubernetes.io/projected/747a9b33-025d-4b52-9b54-7d1b829c6cef-kube-api-access-rdqzs\") pod \"marketplace-operator-79b997595-2zl8q\" (UID: \"747a9b33-025d-4b52-9b54-7d1b829c6cef\") " pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.470024 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/747a9b33-025d-4b52-9b54-7d1b829c6cef-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2zl8q\" (UID: \"747a9b33-025d-4b52-9b54-7d1b829c6cef\") " pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.470170 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/747a9b33-025d-4b52-9b54-7d1b829c6cef-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2zl8q\" (UID: \"747a9b33-025d-4b52-9b54-7d1b829c6cef\") " pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.570983 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/747a9b33-025d-4b52-9b54-7d1b829c6cef-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2zl8q\" (UID: \"747a9b33-025d-4b52-9b54-7d1b829c6cef\") " pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.571066 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdqzs\" (UniqueName: \"kubernetes.io/projected/747a9b33-025d-4b52-9b54-7d1b829c6cef-kube-api-access-rdqzs\") pod \"marketplace-operator-79b997595-2zl8q\" (UID: \"747a9b33-025d-4b52-9b54-7d1b829c6cef\") " pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.571099 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/747a9b33-025d-4b52-9b54-7d1b829c6cef-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2zl8q\" (UID: \"747a9b33-025d-4b52-9b54-7d1b829c6cef\") " pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.573685 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/747a9b33-025d-4b52-9b54-7d1b829c6cef-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2zl8q\" (UID: \"747a9b33-025d-4b52-9b54-7d1b829c6cef\") " pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" Sep 30 05:32:40 crc 
kubenswrapper[4956]: I0930 05:32:40.578716 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/747a9b33-025d-4b52-9b54-7d1b829c6cef-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2zl8q\" (UID: \"747a9b33-025d-4b52-9b54-7d1b829c6cef\") " pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.587582 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdqzs\" (UniqueName: \"kubernetes.io/projected/747a9b33-025d-4b52-9b54-7d1b829c6cef-kube-api-access-rdqzs\") pod \"marketplace-operator-79b997595-2zl8q\" (UID: \"747a9b33-025d-4b52-9b54-7d1b829c6cef\") " pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.727483 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.728187 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.821519 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.822779 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.877328 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs9gq\" (UniqueName: \"kubernetes.io/projected/cb546d14-6d5d-4b44-b501-070ea2251e4b-kube-api-access-fs9gq\") pod \"cb546d14-6d5d-4b44-b501-070ea2251e4b\" (UID: \"cb546d14-6d5d-4b44-b501-070ea2251e4b\") " Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.877845 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb546d14-6d5d-4b44-b501-070ea2251e4b-catalog-content\") pod \"cb546d14-6d5d-4b44-b501-070ea2251e4b\" (UID: \"cb546d14-6d5d-4b44-b501-070ea2251e4b\") " Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.877896 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb546d14-6d5d-4b44-b501-070ea2251e4b-utilities\") pod \"cb546d14-6d5d-4b44-b501-070ea2251e4b\" (UID: \"cb546d14-6d5d-4b44-b501-070ea2251e4b\") " Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.879790 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb546d14-6d5d-4b44-b501-070ea2251e4b-utilities" (OuterVolumeSpecName: "utilities") pod "cb546d14-6d5d-4b44-b501-070ea2251e4b" (UID: "cb546d14-6d5d-4b44-b501-070ea2251e4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.881390 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb546d14-6d5d-4b44-b501-070ea2251e4b-kube-api-access-fs9gq" (OuterVolumeSpecName: "kube-api-access-fs9gq") pod "cb546d14-6d5d-4b44-b501-070ea2251e4b" (UID: "cb546d14-6d5d-4b44-b501-070ea2251e4b"). InnerVolumeSpecName "kube-api-access-fs9gq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.911689 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.925601 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb546d14-6d5d-4b44-b501-070ea2251e4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb546d14-6d5d-4b44-b501-070ea2251e4b" (UID: "cb546d14-6d5d-4b44-b501-070ea2251e4b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.936741 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.979324 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kzn2\" (UniqueName: \"kubernetes.io/projected/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-kube-api-access-5kzn2\") pod \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\" (UID: \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\") " Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.979427 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8r2d\" (UniqueName: \"kubernetes.io/projected/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-kube-api-access-z8r2d\") pod \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\" (UID: \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\") " Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.979478 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-marketplace-operator-metrics\") pod \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\" (UID: 
\"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\") " Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.979506 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-catalog-content\") pod \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\" (UID: \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\") " Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.979546 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-marketplace-trusted-ca\") pod \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\" (UID: \"9bdc12c7-725c-4496-8e0e-e4ec3d911cca\") " Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.979604 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-utilities\") pod \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\" (UID: \"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0\") " Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.979853 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb546d14-6d5d-4b44-b501-070ea2251e4b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.979869 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb546d14-6d5d-4b44-b501-070ea2251e4b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.979886 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs9gq\" (UniqueName: \"kubernetes.io/projected/cb546d14-6d5d-4b44-b501-070ea2251e4b-kube-api-access-fs9gq\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.980370 4956 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "9bdc12c7-725c-4496-8e0e-e4ec3d911cca" (UID: "9bdc12c7-725c-4496-8e0e-e4ec3d911cca"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.981107 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-utilities" (OuterVolumeSpecName: "utilities") pod "4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" (UID: "4eb4328d-e04b-43d6-98d0-9c4f9bb615d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.982158 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-kube-api-access-5kzn2" (OuterVolumeSpecName: "kube-api-access-5kzn2") pod "4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" (UID: "4eb4328d-e04b-43d6-98d0-9c4f9bb615d0"). InnerVolumeSpecName "kube-api-access-5kzn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.982658 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "9bdc12c7-725c-4496-8e0e-e4ec3d911cca" (UID: "9bdc12c7-725c-4496-8e0e-e4ec3d911cca"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:32:40 crc kubenswrapper[4956]: I0930 05:32:40.982942 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-kube-api-access-z8r2d" (OuterVolumeSpecName: "kube-api-access-z8r2d") pod "9bdc12c7-725c-4496-8e0e-e4ec3d911cca" (UID: "9bdc12c7-725c-4496-8e0e-e4ec3d911cca"). InnerVolumeSpecName "kube-api-access-z8r2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.053236 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" (UID: "4eb4328d-e04b-43d6-98d0-9c4f9bb615d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.080718 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js7v6\" (UniqueName: \"kubernetes.io/projected/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-kube-api-access-js7v6\") pod \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\" (UID: \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\") " Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.080805 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac16a38f-5e71-4eee-8189-9361cfd19b84-utilities\") pod \"ac16a38f-5e71-4eee-8189-9361cfd19b84\" (UID: \"ac16a38f-5e71-4eee-8189-9361cfd19b84\") " Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.080834 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-utilities\") pod \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\" (UID: 
\"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\") " Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.080866 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr8k5\" (UniqueName: \"kubernetes.io/projected/ac16a38f-5e71-4eee-8189-9361cfd19b84-kube-api-access-cr8k5\") pod \"ac16a38f-5e71-4eee-8189-9361cfd19b84\" (UID: \"ac16a38f-5e71-4eee-8189-9361cfd19b84\") " Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.080898 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-catalog-content\") pod \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\" (UID: \"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e\") " Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.080922 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac16a38f-5e71-4eee-8189-9361cfd19b84-catalog-content\") pod \"ac16a38f-5e71-4eee-8189-9361cfd19b84\" (UID: \"ac16a38f-5e71-4eee-8189-9361cfd19b84\") " Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.081129 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8r2d\" (UniqueName: \"kubernetes.io/projected/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-kube-api-access-z8r2d\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.081141 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.081150 4956 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:41 crc 
kubenswrapper[4956]: I0930 05:32:41.081158 4956 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9bdc12c7-725c-4496-8e0e-e4ec3d911cca-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.081167 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.081176 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kzn2\" (UniqueName: \"kubernetes.io/projected/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0-kube-api-access-5kzn2\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.081550 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac16a38f-5e71-4eee-8189-9361cfd19b84-utilities" (OuterVolumeSpecName: "utilities") pod "ac16a38f-5e71-4eee-8189-9361cfd19b84" (UID: "ac16a38f-5e71-4eee-8189-9361cfd19b84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.082562 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-utilities" (OuterVolumeSpecName: "utilities") pod "dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" (UID: "dcf78e8b-7c4c-4c27-86e5-752b265c9a1e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.083704 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac16a38f-5e71-4eee-8189-9361cfd19b84-kube-api-access-cr8k5" (OuterVolumeSpecName: "kube-api-access-cr8k5") pod "ac16a38f-5e71-4eee-8189-9361cfd19b84" (UID: "ac16a38f-5e71-4eee-8189-9361cfd19b84"). InnerVolumeSpecName "kube-api-access-cr8k5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.084411 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-kube-api-access-js7v6" (OuterVolumeSpecName: "kube-api-access-js7v6") pod "dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" (UID: "dcf78e8b-7c4c-4c27-86e5-752b265c9a1e"). InnerVolumeSpecName "kube-api-access-js7v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.095233 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" (UID: "dcf78e8b-7c4c-4c27-86e5-752b265c9a1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.155227 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac16a38f-5e71-4eee-8189-9361cfd19b84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac16a38f-5e71-4eee-8189-9361cfd19b84" (UID: "ac16a38f-5e71-4eee-8189-9361cfd19b84"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.182555 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac16a38f-5e71-4eee-8189-9361cfd19b84-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.182588 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.182600 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr8k5\" (UniqueName: \"kubernetes.io/projected/ac16a38f-5e71-4eee-8189-9361cfd19b84-kube-api-access-cr8k5\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.182611 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.182620 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac16a38f-5e71-4eee-8189-9361cfd19b84-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.182628 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js7v6\" (UniqueName: \"kubernetes.io/projected/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e-kube-api-access-js7v6\") on node \"crc\" DevicePath \"\"" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.203954 4956 generic.go:334] "Generic (PLEG): container finished" podID="4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" containerID="62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959" exitCode=0 Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.204030 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nrh7" event={"ID":"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0","Type":"ContainerDied","Data":"62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959"} Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.204047 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4nrh7" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.204076 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nrh7" event={"ID":"4eb4328d-e04b-43d6-98d0-9c4f9bb615d0","Type":"ContainerDied","Data":"951dc5b9dd7e7d5b4f4b6c77ec782d773c3faee534eadddea1220860158fa57d"} Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.204104 4956 scope.go:117] "RemoveContainer" containerID="62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.205379 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2zl8q"] Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.209220 4956 generic.go:334] "Generic (PLEG): container finished" podID="ac16a38f-5e71-4eee-8189-9361cfd19b84" containerID="795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101" exitCode=0 Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.209328 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gv9cd" event={"ID":"ac16a38f-5e71-4eee-8189-9361cfd19b84","Type":"ContainerDied","Data":"795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101"} Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.209383 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gv9cd" 
event={"ID":"ac16a38f-5e71-4eee-8189-9361cfd19b84","Type":"ContainerDied","Data":"7f7d15b4e3afd61a14f41d1d153dfd1b877ed015172340872a41281dfca8cd33"} Sep 30 05:32:41 crc kubenswrapper[4956]: W0930 05:32:41.209329 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod747a9b33_025d_4b52_9b54_7d1b829c6cef.slice/crio-8f81196d565485abe4046ba6a81ae40cf01e97c8b3b04686e45a3d8d9a04b12b WatchSource:0}: Error finding container 8f81196d565485abe4046ba6a81ae40cf01e97c8b3b04686e45a3d8d9a04b12b: Status 404 returned error can't find the container with id 8f81196d565485abe4046ba6a81ae40cf01e97c8b3b04686e45a3d8d9a04b12b Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.209459 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gv9cd" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.213794 4956 generic.go:334] "Generic (PLEG): container finished" podID="9bdc12c7-725c-4496-8e0e-e4ec3d911cca" containerID="a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e" exitCode=0 Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.213904 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.213953 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" event={"ID":"9bdc12c7-725c-4496-8e0e-e4ec3d911cca","Type":"ContainerDied","Data":"a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e"} Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.213994 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rvfx8" event={"ID":"9bdc12c7-725c-4496-8e0e-e4ec3d911cca","Type":"ContainerDied","Data":"89d23efbfd40682105b77559e4fe9a30342e2bae5051216e9e305e1e8ebadba3"} Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.216878 4956 generic.go:334] "Generic (PLEG): container finished" podID="dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" containerID="8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839" exitCode=0 Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.216947 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmln2" event={"ID":"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e","Type":"ContainerDied","Data":"8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839"} Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.216973 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lmln2" event={"ID":"dcf78e8b-7c4c-4c27-86e5-752b265c9a1e","Type":"ContainerDied","Data":"5708c9fa8f2e13922ff39820b54f6c7972267fb20038f3fc459eaf1066ba7f47"} Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.216950 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lmln2" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.221529 4956 generic.go:334] "Generic (PLEG): container finished" podID="cb546d14-6d5d-4b44-b501-070ea2251e4b" containerID="49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138" exitCode=0 Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.221587 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmr6c" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.221616 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmr6c" event={"ID":"cb546d14-6d5d-4b44-b501-070ea2251e4b","Type":"ContainerDied","Data":"49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138"} Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.221643 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmr6c" event={"ID":"cb546d14-6d5d-4b44-b501-070ea2251e4b","Type":"ContainerDied","Data":"7689d7801082f87063ffa49011bc986a8e8cae285c6a77722fe82544e51828bd"} Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.235898 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4nrh7"] Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.238465 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4nrh7"] Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.252244 4956 scope.go:117] "RemoveContainer" containerID="3df2aeb034252bcf737d67d89dc2411610633551ea872358a1063cc74e6aa333" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.258671 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gv9cd"] Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.264622 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-gv9cd"] Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.270580 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rvfx8"] Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.272891 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rvfx8"] Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.281554 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmln2"] Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.285655 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lmln2"] Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.297326 4956 scope.go:117] "RemoveContainer" containerID="a995d6987a46f32e05b32a1f19b702b7ade8afb18380ebf6227190a4e4b8af21" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.298280 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mmr6c"] Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.306070 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mmr6c"] Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.313064 4956 scope.go:117] "RemoveContainer" containerID="62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959" Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 05:32:41.313861 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959\": container with ID starting with 62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959 not found: ID does not exist" containerID="62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.313908 
4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959"} err="failed to get container status \"62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959\": rpc error: code = NotFound desc = could not find container \"62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959\": container with ID starting with 62b572341f3cc9f8712e24d926b025da7579e0e343fbfe433dcce1e802eb9959 not found: ID does not exist" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.313936 4956 scope.go:117] "RemoveContainer" containerID="3df2aeb034252bcf737d67d89dc2411610633551ea872358a1063cc74e6aa333" Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 05:32:41.314760 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3df2aeb034252bcf737d67d89dc2411610633551ea872358a1063cc74e6aa333\": container with ID starting with 3df2aeb034252bcf737d67d89dc2411610633551ea872358a1063cc74e6aa333 not found: ID does not exist" containerID="3df2aeb034252bcf737d67d89dc2411610633551ea872358a1063cc74e6aa333" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.314822 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3df2aeb034252bcf737d67d89dc2411610633551ea872358a1063cc74e6aa333"} err="failed to get container status \"3df2aeb034252bcf737d67d89dc2411610633551ea872358a1063cc74e6aa333\": rpc error: code = NotFound desc = could not find container \"3df2aeb034252bcf737d67d89dc2411610633551ea872358a1063cc74e6aa333\": container with ID starting with 3df2aeb034252bcf737d67d89dc2411610633551ea872358a1063cc74e6aa333 not found: ID does not exist" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.314843 4956 scope.go:117] "RemoveContainer" containerID="a995d6987a46f32e05b32a1f19b702b7ade8afb18380ebf6227190a4e4b8af21" Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 
05:32:41.315254 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a995d6987a46f32e05b32a1f19b702b7ade8afb18380ebf6227190a4e4b8af21\": container with ID starting with a995d6987a46f32e05b32a1f19b702b7ade8afb18380ebf6227190a4e4b8af21 not found: ID does not exist" containerID="a995d6987a46f32e05b32a1f19b702b7ade8afb18380ebf6227190a4e4b8af21" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.315294 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a995d6987a46f32e05b32a1f19b702b7ade8afb18380ebf6227190a4e4b8af21"} err="failed to get container status \"a995d6987a46f32e05b32a1f19b702b7ade8afb18380ebf6227190a4e4b8af21\": rpc error: code = NotFound desc = could not find container \"a995d6987a46f32e05b32a1f19b702b7ade8afb18380ebf6227190a4e4b8af21\": container with ID starting with a995d6987a46f32e05b32a1f19b702b7ade8afb18380ebf6227190a4e4b8af21 not found: ID does not exist" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.315327 4956 scope.go:117] "RemoveContainer" containerID="795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.333268 4956 scope.go:117] "RemoveContainer" containerID="8d64d3bc992a15bd7faa98a8bf7eb41d6d49adea26a8dd835a64aa3a031a8c4b" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.355924 4956 scope.go:117] "RemoveContainer" containerID="eefa24cc4f78dfd3afe0b1008eb10d938b818a6b80e120f49f591e9062336345" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.373811 4956 scope.go:117] "RemoveContainer" containerID="795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101" Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 05:32:41.374422 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101\": container 
with ID starting with 795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101 not found: ID does not exist" containerID="795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.374473 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101"} err="failed to get container status \"795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101\": rpc error: code = NotFound desc = could not find container \"795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101\": container with ID starting with 795b8530a197bcb932dca82ea6319a06c93ce8e8d4f3be96e4436a4497056101 not found: ID does not exist" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.374508 4956 scope.go:117] "RemoveContainer" containerID="8d64d3bc992a15bd7faa98a8bf7eb41d6d49adea26a8dd835a64aa3a031a8c4b" Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 05:32:41.375393 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d64d3bc992a15bd7faa98a8bf7eb41d6d49adea26a8dd835a64aa3a031a8c4b\": container with ID starting with 8d64d3bc992a15bd7faa98a8bf7eb41d6d49adea26a8dd835a64aa3a031a8c4b not found: ID does not exist" containerID="8d64d3bc992a15bd7faa98a8bf7eb41d6d49adea26a8dd835a64aa3a031a8c4b" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.375456 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d64d3bc992a15bd7faa98a8bf7eb41d6d49adea26a8dd835a64aa3a031a8c4b"} err="failed to get container status \"8d64d3bc992a15bd7faa98a8bf7eb41d6d49adea26a8dd835a64aa3a031a8c4b\": rpc error: code = NotFound desc = could not find container \"8d64d3bc992a15bd7faa98a8bf7eb41d6d49adea26a8dd835a64aa3a031a8c4b\": container with ID starting with 8d64d3bc992a15bd7faa98a8bf7eb41d6d49adea26a8dd835a64aa3a031a8c4b not 
found: ID does not exist" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.375493 4956 scope.go:117] "RemoveContainer" containerID="eefa24cc4f78dfd3afe0b1008eb10d938b818a6b80e120f49f591e9062336345" Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 05:32:41.375889 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eefa24cc4f78dfd3afe0b1008eb10d938b818a6b80e120f49f591e9062336345\": container with ID starting with eefa24cc4f78dfd3afe0b1008eb10d938b818a6b80e120f49f591e9062336345 not found: ID does not exist" containerID="eefa24cc4f78dfd3afe0b1008eb10d938b818a6b80e120f49f591e9062336345" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.375914 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eefa24cc4f78dfd3afe0b1008eb10d938b818a6b80e120f49f591e9062336345"} err="failed to get container status \"eefa24cc4f78dfd3afe0b1008eb10d938b818a6b80e120f49f591e9062336345\": rpc error: code = NotFound desc = could not find container \"eefa24cc4f78dfd3afe0b1008eb10d938b818a6b80e120f49f591e9062336345\": container with ID starting with eefa24cc4f78dfd3afe0b1008eb10d938b818a6b80e120f49f591e9062336345 not found: ID does not exist" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.375933 4956 scope.go:117] "RemoveContainer" containerID="a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.387742 4956 scope.go:117] "RemoveContainer" containerID="a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e" Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 05:32:41.388216 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e\": container with ID starting with a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e not found: ID does not 
exist" containerID="a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.388252 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e"} err="failed to get container status \"a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e\": rpc error: code = NotFound desc = could not find container \"a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e\": container with ID starting with a2d50a12688e20a3dc31b55f4ec54a49172790e50bc8ad3c5127b82a1db26e3e not found: ID does not exist" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.388274 4956 scope.go:117] "RemoveContainer" containerID="8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.402904 4956 scope.go:117] "RemoveContainer" containerID="d1954c53ce0fad01a5d32f4ec39e949816f535551e8458c8e88f624318ad0696" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.429041 4956 scope.go:117] "RemoveContainer" containerID="7211b5a08eaa6638ad92cff3abdee9d8c5354a2a33cb7dae524485ad35bc5498" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.461247 4956 scope.go:117] "RemoveContainer" containerID="8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839" Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 05:32:41.462408 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839\": container with ID starting with 8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839 not found: ID does not exist" containerID="8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839" Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.462523 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839"} err="failed to get container status \"8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839\": rpc error: code = NotFound desc = could not find container \"8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839\": container with ID starting with 8d399b689a16edd0eaa7bf7ae361d1e33af072bb1bb7523bf9aaa5f099e3c839 not found: ID does not exist"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.462726 4956 scope.go:117] "RemoveContainer" containerID="d1954c53ce0fad01a5d32f4ec39e949816f535551e8458c8e88f624318ad0696"
Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 05:32:41.464239 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1954c53ce0fad01a5d32f4ec39e949816f535551e8458c8e88f624318ad0696\": container with ID starting with d1954c53ce0fad01a5d32f4ec39e949816f535551e8458c8e88f624318ad0696 not found: ID does not exist" containerID="d1954c53ce0fad01a5d32f4ec39e949816f535551e8458c8e88f624318ad0696"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.464298 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1954c53ce0fad01a5d32f4ec39e949816f535551e8458c8e88f624318ad0696"} err="failed to get container status \"d1954c53ce0fad01a5d32f4ec39e949816f535551e8458c8e88f624318ad0696\": rpc error: code = NotFound desc = could not find container \"d1954c53ce0fad01a5d32f4ec39e949816f535551e8458c8e88f624318ad0696\": container with ID starting with d1954c53ce0fad01a5d32f4ec39e949816f535551e8458c8e88f624318ad0696 not found: ID does not exist"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.464340 4956 scope.go:117] "RemoveContainer" containerID="7211b5a08eaa6638ad92cff3abdee9d8c5354a2a33cb7dae524485ad35bc5498"
Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 05:32:41.465598 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7211b5a08eaa6638ad92cff3abdee9d8c5354a2a33cb7dae524485ad35bc5498\": container with ID starting with 7211b5a08eaa6638ad92cff3abdee9d8c5354a2a33cb7dae524485ad35bc5498 not found: ID does not exist" containerID="7211b5a08eaa6638ad92cff3abdee9d8c5354a2a33cb7dae524485ad35bc5498"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.465660 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7211b5a08eaa6638ad92cff3abdee9d8c5354a2a33cb7dae524485ad35bc5498"} err="failed to get container status \"7211b5a08eaa6638ad92cff3abdee9d8c5354a2a33cb7dae524485ad35bc5498\": rpc error: code = NotFound desc = could not find container \"7211b5a08eaa6638ad92cff3abdee9d8c5354a2a33cb7dae524485ad35bc5498\": container with ID starting with 7211b5a08eaa6638ad92cff3abdee9d8c5354a2a33cb7dae524485ad35bc5498 not found: ID does not exist"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.465694 4956 scope.go:117] "RemoveContainer" containerID="49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.479335 4956 scope.go:117] "RemoveContainer" containerID="a8e51280b0329a037aafa3822cb5584aa0b98182a6889a7ad46c91b97e841ecf"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.496751 4956 scope.go:117] "RemoveContainer" containerID="0740c5e8652d2e9f72efecd152b50ae0c4c1f0bc99531ffa83a7ee9f85ce8c6e"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.512383 4956 scope.go:117] "RemoveContainer" containerID="49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138"
Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 05:32:41.513312 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138\": container with ID starting with 49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138 not found: ID does not exist" containerID="49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.513678 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138"} err="failed to get container status \"49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138\": rpc error: code = NotFound desc = could not find container \"49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138\": container with ID starting with 49518c7985712252cac61cb40f1290d2fe5b931e62ba803dc0dee81a93ce1138 not found: ID does not exist"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.513702 4956 scope.go:117] "RemoveContainer" containerID="a8e51280b0329a037aafa3822cb5584aa0b98182a6889a7ad46c91b97e841ecf"
Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 05:32:41.514029 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e51280b0329a037aafa3822cb5584aa0b98182a6889a7ad46c91b97e841ecf\": container with ID starting with a8e51280b0329a037aafa3822cb5584aa0b98182a6889a7ad46c91b97e841ecf not found: ID does not exist" containerID="a8e51280b0329a037aafa3822cb5584aa0b98182a6889a7ad46c91b97e841ecf"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.514082 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e51280b0329a037aafa3822cb5584aa0b98182a6889a7ad46c91b97e841ecf"} err="failed to get container status \"a8e51280b0329a037aafa3822cb5584aa0b98182a6889a7ad46c91b97e841ecf\": rpc error: code = NotFound desc = could not find container \"a8e51280b0329a037aafa3822cb5584aa0b98182a6889a7ad46c91b97e841ecf\": container with ID starting with a8e51280b0329a037aafa3822cb5584aa0b98182a6889a7ad46c91b97e841ecf not found: ID does not exist"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.514135 4956 scope.go:117] "RemoveContainer" containerID="0740c5e8652d2e9f72efecd152b50ae0c4c1f0bc99531ffa83a7ee9f85ce8c6e"
Sep 30 05:32:41 crc kubenswrapper[4956]: E0930 05:32:41.515432 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0740c5e8652d2e9f72efecd152b50ae0c4c1f0bc99531ffa83a7ee9f85ce8c6e\": container with ID starting with 0740c5e8652d2e9f72efecd152b50ae0c4c1f0bc99531ffa83a7ee9f85ce8c6e not found: ID does not exist" containerID="0740c5e8652d2e9f72efecd152b50ae0c4c1f0bc99531ffa83a7ee9f85ce8c6e"
Sep 30 05:32:41 crc kubenswrapper[4956]: I0930 05:32:41.515455 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0740c5e8652d2e9f72efecd152b50ae0c4c1f0bc99531ffa83a7ee9f85ce8c6e"} err="failed to get container status \"0740c5e8652d2e9f72efecd152b50ae0c4c1f0bc99531ffa83a7ee9f85ce8c6e\": rpc error: code = NotFound desc = could not find container \"0740c5e8652d2e9f72efecd152b50ae0c4c1f0bc99531ffa83a7ee9f85ce8c6e\": container with ID starting with 0740c5e8652d2e9f72efecd152b50ae0c4c1f0bc99531ffa83a7ee9f85ce8c6e not found: ID does not exist"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.229961 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" event={"ID":"747a9b33-025d-4b52-9b54-7d1b829c6cef","Type":"ContainerStarted","Data":"2e62089e6956f59032afd354b5e0c29cd5e30c5e67ee0a3cbe9f9f3cc519789a"}
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.229997 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" event={"ID":"747a9b33-025d-4b52-9b54-7d1b829c6cef","Type":"ContainerStarted","Data":"8f81196d565485abe4046ba6a81ae40cf01e97c8b3b04686e45a3d8d9a04b12b"}
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.230945 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.236718 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.246669 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2zl8q" podStartSLOduration=2.246642783 podStartE2EDuration="2.246642783s" podCreationTimestamp="2025-09-30 05:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:32:42.242930643 +0000 UTC m=+232.570051168" watchObservedRunningTime="2025-09-30 05:32:42.246642783 +0000 UTC m=+232.573763318"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.350174 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" path="/var/lib/kubelet/pods/4eb4328d-e04b-43d6-98d0-9c4f9bb615d0/volumes"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.351149 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bdc12c7-725c-4496-8e0e-e4ec3d911cca" path="/var/lib/kubelet/pods/9bdc12c7-725c-4496-8e0e-e4ec3d911cca/volumes"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.351684 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac16a38f-5e71-4eee-8189-9361cfd19b84" path="/var/lib/kubelet/pods/ac16a38f-5e71-4eee-8189-9361cfd19b84/volumes"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.352891 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb546d14-6d5d-4b44-b501-070ea2251e4b" path="/var/lib/kubelet/pods/cb546d14-6d5d-4b44-b501-070ea2251e4b/volumes"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.353483 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" path="/var/lib/kubelet/pods/dcf78e8b-7c4c-4c27-86e5-752b265c9a1e/volumes"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493600 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rjpbl"]
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493797 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bdc12c7-725c-4496-8e0e-e4ec3d911cca" containerName="marketplace-operator"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493808 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bdc12c7-725c-4496-8e0e-e4ec3d911cca" containerName="marketplace-operator"
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493820 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" containerName="registry-server"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493827 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" containerName="registry-server"
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493836 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb546d14-6d5d-4b44-b501-070ea2251e4b" containerName="extract-utilities"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493842 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb546d14-6d5d-4b44-b501-070ea2251e4b" containerName="extract-utilities"
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493849 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac16a38f-5e71-4eee-8189-9361cfd19b84" containerName="extract-content"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493856 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac16a38f-5e71-4eee-8189-9361cfd19b84" containerName="extract-content"
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493862 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" containerName="extract-utilities"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493868 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" containerName="extract-utilities"
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493881 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac16a38f-5e71-4eee-8189-9361cfd19b84" containerName="extract-utilities"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493886 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac16a38f-5e71-4eee-8189-9361cfd19b84" containerName="extract-utilities"
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493894 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" containerName="registry-server"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493900 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" containerName="registry-server"
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493907 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" containerName="extract-content"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493914 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" containerName="extract-content"
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493920 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" containerName="extract-utilities"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493926 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" containerName="extract-utilities"
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493936 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb546d14-6d5d-4b44-b501-070ea2251e4b" containerName="registry-server"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493944 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb546d14-6d5d-4b44-b501-070ea2251e4b" containerName="registry-server"
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493951 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb546d14-6d5d-4b44-b501-070ea2251e4b" containerName="extract-content"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493958 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb546d14-6d5d-4b44-b501-070ea2251e4b" containerName="extract-content"
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493967 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" containerName="extract-content"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493973 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" containerName="extract-content"
Sep 30 05:32:42 crc kubenswrapper[4956]: E0930 05:32:42.493980 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac16a38f-5e71-4eee-8189-9361cfd19b84" containerName="registry-server"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.493985 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac16a38f-5e71-4eee-8189-9361cfd19b84" containerName="registry-server"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.494059 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb4328d-e04b-43d6-98d0-9c4f9bb615d0" containerName="registry-server"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.494072 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb546d14-6d5d-4b44-b501-070ea2251e4b" containerName="registry-server"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.494081 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf78e8b-7c4c-4c27-86e5-752b265c9a1e" containerName="registry-server"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.494089 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac16a38f-5e71-4eee-8189-9361cfd19b84" containerName="registry-server"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.494096 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bdc12c7-725c-4496-8e0e-e4ec3d911cca" containerName="marketplace-operator"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.496484 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjpbl"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.498923 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.502268 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjpbl"]
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.600266 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk44r\" (UniqueName: \"kubernetes.io/projected/4b003c5c-9c5c-4732-9370-c49aed57a7c2-kube-api-access-vk44r\") pod \"redhat-marketplace-rjpbl\" (UID: \"4b003c5c-9c5c-4732-9370-c49aed57a7c2\") " pod="openshift-marketplace/redhat-marketplace-rjpbl"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.600311 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b003c5c-9c5c-4732-9370-c49aed57a7c2-catalog-content\") pod \"redhat-marketplace-rjpbl\" (UID: \"4b003c5c-9c5c-4732-9370-c49aed57a7c2\") " pod="openshift-marketplace/redhat-marketplace-rjpbl"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.600378 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b003c5c-9c5c-4732-9370-c49aed57a7c2-utilities\") pod \"redhat-marketplace-rjpbl\" (UID: \"4b003c5c-9c5c-4732-9370-c49aed57a7c2\") " pod="openshift-marketplace/redhat-marketplace-rjpbl"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.695851 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q9m7s"]
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.696984 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q9m7s"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.699000 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.701159 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b003c5c-9c5c-4732-9370-c49aed57a7c2-utilities\") pod \"redhat-marketplace-rjpbl\" (UID: \"4b003c5c-9c5c-4732-9370-c49aed57a7c2\") " pod="openshift-marketplace/redhat-marketplace-rjpbl"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.701230 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk44r\" (UniqueName: \"kubernetes.io/projected/4b003c5c-9c5c-4732-9370-c49aed57a7c2-kube-api-access-vk44r\") pod \"redhat-marketplace-rjpbl\" (UID: \"4b003c5c-9c5c-4732-9370-c49aed57a7c2\") " pod="openshift-marketplace/redhat-marketplace-rjpbl"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.701262 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b003c5c-9c5c-4732-9370-c49aed57a7c2-catalog-content\") pod \"redhat-marketplace-rjpbl\" (UID: \"4b003c5c-9c5c-4732-9370-c49aed57a7c2\") " pod="openshift-marketplace/redhat-marketplace-rjpbl"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.701704 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b003c5c-9c5c-4732-9370-c49aed57a7c2-catalog-content\") pod \"redhat-marketplace-rjpbl\" (UID: \"4b003c5c-9c5c-4732-9370-c49aed57a7c2\") " pod="openshift-marketplace/redhat-marketplace-rjpbl"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.702053 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b003c5c-9c5c-4732-9370-c49aed57a7c2-utilities\") pod \"redhat-marketplace-rjpbl\" (UID: \"4b003c5c-9c5c-4732-9370-c49aed57a7c2\") " pod="openshift-marketplace/redhat-marketplace-rjpbl"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.706821 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q9m7s"]
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.724469 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk44r\" (UniqueName: \"kubernetes.io/projected/4b003c5c-9c5c-4732-9370-c49aed57a7c2-kube-api-access-vk44r\") pod \"redhat-marketplace-rjpbl\" (UID: \"4b003c5c-9c5c-4732-9370-c49aed57a7c2\") " pod="openshift-marketplace/redhat-marketplace-rjpbl"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.802711 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de4dac4b-b680-4e12-beb2-a999348eddd7-utilities\") pod \"redhat-operators-q9m7s\" (UID: \"de4dac4b-b680-4e12-beb2-a999348eddd7\") " pod="openshift-marketplace/redhat-operators-q9m7s"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.803164 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfgfv\" (UniqueName: \"kubernetes.io/projected/de4dac4b-b680-4e12-beb2-a999348eddd7-kube-api-access-jfgfv\") pod \"redhat-operators-q9m7s\" (UID: \"de4dac4b-b680-4e12-beb2-a999348eddd7\") " pod="openshift-marketplace/redhat-operators-q9m7s"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.803192 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de4dac4b-b680-4e12-beb2-a999348eddd7-catalog-content\") pod \"redhat-operators-q9m7s\" (UID: \"de4dac4b-b680-4e12-beb2-a999348eddd7\") " pod="openshift-marketplace/redhat-operators-q9m7s"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.816586 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjpbl"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.904892 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfgfv\" (UniqueName: \"kubernetes.io/projected/de4dac4b-b680-4e12-beb2-a999348eddd7-kube-api-access-jfgfv\") pod \"redhat-operators-q9m7s\" (UID: \"de4dac4b-b680-4e12-beb2-a999348eddd7\") " pod="openshift-marketplace/redhat-operators-q9m7s"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.904938 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de4dac4b-b680-4e12-beb2-a999348eddd7-catalog-content\") pod \"redhat-operators-q9m7s\" (UID: \"de4dac4b-b680-4e12-beb2-a999348eddd7\") " pod="openshift-marketplace/redhat-operators-q9m7s"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.905019 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de4dac4b-b680-4e12-beb2-a999348eddd7-utilities\") pod \"redhat-operators-q9m7s\" (UID: \"de4dac4b-b680-4e12-beb2-a999348eddd7\") " pod="openshift-marketplace/redhat-operators-q9m7s"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.906013 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de4dac4b-b680-4e12-beb2-a999348eddd7-utilities\") pod \"redhat-operators-q9m7s\" (UID: \"de4dac4b-b680-4e12-beb2-a999348eddd7\") " pod="openshift-marketplace/redhat-operators-q9m7s"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.906537 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de4dac4b-b680-4e12-beb2-a999348eddd7-catalog-content\") pod \"redhat-operators-q9m7s\" (UID: \"de4dac4b-b680-4e12-beb2-a999348eddd7\") " pod="openshift-marketplace/redhat-operators-q9m7s"
Sep 30 05:32:42 crc kubenswrapper[4956]: I0930 05:32:42.928202 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfgfv\" (UniqueName: \"kubernetes.io/projected/de4dac4b-b680-4e12-beb2-a999348eddd7-kube-api-access-jfgfv\") pod \"redhat-operators-q9m7s\" (UID: \"de4dac4b-b680-4e12-beb2-a999348eddd7\") " pod="openshift-marketplace/redhat-operators-q9m7s"
Sep 30 05:32:43 crc kubenswrapper[4956]: I0930 05:32:43.042077 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q9m7s"
Sep 30 05:32:43 crc kubenswrapper[4956]: I0930 05:32:43.222648 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjpbl"]
Sep 30 05:32:43 crc kubenswrapper[4956]: I0930 05:32:43.238487 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjpbl" event={"ID":"4b003c5c-9c5c-4732-9370-c49aed57a7c2","Type":"ContainerStarted","Data":"360e950249fcab0356812ebccc9a67b19360ff97577ede1f927548274ac44e28"}
Sep 30 05:32:43 crc kubenswrapper[4956]: I0930 05:32:43.436539 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q9m7s"]
Sep 30 05:32:43 crc kubenswrapper[4956]: W0930 05:32:43.442618 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde4dac4b_b680_4e12_beb2_a999348eddd7.slice/crio-50f8fc648b42124599b6b21a9e8f452ba4444e8c4bcc7376dab4ea783594e583 WatchSource:0}: Error finding container 50f8fc648b42124599b6b21a9e8f452ba4444e8c4bcc7376dab4ea783594e583: Status 404 returned error can't find the container with id 50f8fc648b42124599b6b21a9e8f452ba4444e8c4bcc7376dab4ea783594e583
Sep 30 05:32:44 crc kubenswrapper[4956]: I0930 05:32:44.243464 4956 generic.go:334] "Generic (PLEG): container finished" podID="4b003c5c-9c5c-4732-9370-c49aed57a7c2" containerID="f1d5e43cf91985315cbc20b744591eac29eb5a3458d2f3a6ff251bada5397da5" exitCode=0
Sep 30 05:32:44 crc kubenswrapper[4956]: I0930 05:32:44.243553 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjpbl" event={"ID":"4b003c5c-9c5c-4732-9370-c49aed57a7c2","Type":"ContainerDied","Data":"f1d5e43cf91985315cbc20b744591eac29eb5a3458d2f3a6ff251bada5397da5"}
Sep 30 05:32:44 crc kubenswrapper[4956]: I0930 05:32:44.245786 4956 generic.go:334] "Generic (PLEG): container finished" podID="de4dac4b-b680-4e12-beb2-a999348eddd7" containerID="adb1c240c4bb8321905862ef6bd60a045509051f2f37321abd93d8ff8197ecf8" exitCode=0
Sep 30 05:32:44 crc kubenswrapper[4956]: I0930 05:32:44.245892 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q9m7s" event={"ID":"de4dac4b-b680-4e12-beb2-a999348eddd7","Type":"ContainerDied","Data":"adb1c240c4bb8321905862ef6bd60a045509051f2f37321abd93d8ff8197ecf8"}
Sep 30 05:32:44 crc kubenswrapper[4956]: I0930 05:32:44.245957 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q9m7s" event={"ID":"de4dac4b-b680-4e12-beb2-a999348eddd7","Type":"ContainerStarted","Data":"50f8fc648b42124599b6b21a9e8f452ba4444e8c4bcc7376dab4ea783594e583"}
Sep 30 05:32:44 crc kubenswrapper[4956]: I0930 05:32:44.891746 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nmgpv"]
Sep 30 05:32:44 crc kubenswrapper[4956]: I0930 05:32:44.893171 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nmgpv"
Sep 30 05:32:44 crc kubenswrapper[4956]: I0930 05:32:44.895274 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Sep 30 05:32:44 crc kubenswrapper[4956]: I0930 05:32:44.903609 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nmgpv"]
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.034159 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bmjb\" (UniqueName: \"kubernetes.io/projected/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-kube-api-access-2bmjb\") pod \"community-operators-nmgpv\" (UID: \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\") " pod="openshift-marketplace/community-operators-nmgpv"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.034247 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-catalog-content\") pod \"community-operators-nmgpv\" (UID: \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\") " pod="openshift-marketplace/community-operators-nmgpv"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.034326 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-utilities\") pod \"community-operators-nmgpv\" (UID: \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\") " pod="openshift-marketplace/community-operators-nmgpv"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.094435 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k2bs6"]
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.095325 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.097410 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.166329 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bmjb\" (UniqueName: \"kubernetes.io/projected/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-kube-api-access-2bmjb\") pod \"community-operators-nmgpv\" (UID: \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\") " pod="openshift-marketplace/community-operators-nmgpv"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.166360 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-catalog-content\") pod \"community-operators-nmgpv\" (UID: \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\") " pod="openshift-marketplace/community-operators-nmgpv"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.166385 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hhpk\" (UniqueName: \"kubernetes.io/projected/23fe685f-5eac-43e5-b4fc-c48e85553142-kube-api-access-9hhpk\") pod \"certified-operators-k2bs6\" (UID: \"23fe685f-5eac-43e5-b4fc-c48e85553142\") " pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.166424 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23fe685f-5eac-43e5-b4fc-c48e85553142-utilities\") pod \"certified-operators-k2bs6\" (UID: \"23fe685f-5eac-43e5-b4fc-c48e85553142\") " pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.166448 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-utilities\") pod \"community-operators-nmgpv\" (UID: \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\") " pod="openshift-marketplace/community-operators-nmgpv"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.166875 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-catalog-content\") pod \"community-operators-nmgpv\" (UID: \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\") " pod="openshift-marketplace/community-operators-nmgpv"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.166952 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-utilities\") pod \"community-operators-nmgpv\" (UID: \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\") " pod="openshift-marketplace/community-operators-nmgpv"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.166994 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23fe685f-5eac-43e5-b4fc-c48e85553142-catalog-content\") pod \"certified-operators-k2bs6\" (UID: \"23fe685f-5eac-43e5-b4fc-c48e85553142\") " pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.169804 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k2bs6"]
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.190950 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bmjb\" (UniqueName: \"kubernetes.io/projected/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-kube-api-access-2bmjb\") pod \"community-operators-nmgpv\" (UID: \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\") " pod="openshift-marketplace/community-operators-nmgpv"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.213731 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nmgpv"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.256563 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjpbl" event={"ID":"4b003c5c-9c5c-4732-9370-c49aed57a7c2","Type":"ContainerStarted","Data":"894f07dad768368b299af338ba3360eb9398836f3a63e3733070a4c1e2149843"}
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.262300 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q9m7s" event={"ID":"de4dac4b-b680-4e12-beb2-a999348eddd7","Type":"ContainerStarted","Data":"84ce65a6b7205e667e9ea53d3bd42b0da135954dc26f02a2fff8bdbc57756c59"}
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.267700 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23fe685f-5eac-43e5-b4fc-c48e85553142-catalog-content\") pod \"certified-operators-k2bs6\" (UID: \"23fe685f-5eac-43e5-b4fc-c48e85553142\") " pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.268748 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23fe685f-5eac-43e5-b4fc-c48e85553142-catalog-content\") pod \"certified-operators-k2bs6\" (UID: \"23fe685f-5eac-43e5-b4fc-c48e85553142\") " pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.271810 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hhpk\" (UniqueName: \"kubernetes.io/projected/23fe685f-5eac-43e5-b4fc-c48e85553142-kube-api-access-9hhpk\") pod \"certified-operators-k2bs6\" (UID: \"23fe685f-5eac-43e5-b4fc-c48e85553142\") " pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.271913 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23fe685f-5eac-43e5-b4fc-c48e85553142-utilities\") pod \"certified-operators-k2bs6\" (UID: \"23fe685f-5eac-43e5-b4fc-c48e85553142\") " pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.272330 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23fe685f-5eac-43e5-b4fc-c48e85553142-utilities\") pod \"certified-operators-k2bs6\" (UID: \"23fe685f-5eac-43e5-b4fc-c48e85553142\") " pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.295389 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hhpk\" (UniqueName: \"kubernetes.io/projected/23fe685f-5eac-43e5-b4fc-c48e85553142-kube-api-access-9hhpk\") pod \"certified-operators-k2bs6\" (UID: \"23fe685f-5eac-43e5-b4fc-c48e85553142\") " pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.481134 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.605800 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nmgpv"]
Sep 30 05:32:45 crc kubenswrapper[4956]: I0930 05:32:45.884809 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k2bs6"]
Sep 30 05:32:46 crc kubenswrapper[4956]: I0930 05:32:46.271511 4956 generic.go:334] "Generic (PLEG): container finished" podID="4b003c5c-9c5c-4732-9370-c49aed57a7c2" containerID="894f07dad768368b299af338ba3360eb9398836f3a63e3733070a4c1e2149843" exitCode=0
Sep 30 05:32:46 crc kubenswrapper[4956]: I0930 05:32:46.271560 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjpbl" event={"ID":"4b003c5c-9c5c-4732-9370-c49aed57a7c2","Type":"ContainerDied","Data":"894f07dad768368b299af338ba3360eb9398836f3a63e3733070a4c1e2149843"}
Sep 30 05:32:46 crc kubenswrapper[4956]: I0930 05:32:46.273503 4956 generic.go:334] "Generic (PLEG): container finished" podID="88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" containerID="2882f216b49aa1d6b463a189939765ca3efaaea0b6d6ea643e1436e065fc469a" exitCode=0
Sep 30 05:32:46 crc kubenswrapper[4956]: I0930 05:32:46.273551 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgpv" event={"ID":"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5","Type":"ContainerDied","Data":"2882f216b49aa1d6b463a189939765ca3efaaea0b6d6ea643e1436e065fc469a"}
Sep 30 05:32:46 crc kubenswrapper[4956]: I0930 05:32:46.273720 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgpv" event={"ID":"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5","Type":"ContainerStarted","Data":"62a79d2b5f87f317bd987f9f8954e33ea1a5dc00d2cd86ee2ef11ce84ad637a6"}
Sep 30 05:32:46 crc kubenswrapper[4956]: I0930 05:32:46.277914 4956 generic.go:334] "Generic
(PLEG): container finished" podID="23fe685f-5eac-43e5-b4fc-c48e85553142" containerID="9ef971135706cda34fd6e50b70f0e2b967d33ec1a50dd4d08a9055547376358d" exitCode=0 Sep 30 05:32:46 crc kubenswrapper[4956]: I0930 05:32:46.277961 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bs6" event={"ID":"23fe685f-5eac-43e5-b4fc-c48e85553142","Type":"ContainerDied","Data":"9ef971135706cda34fd6e50b70f0e2b967d33ec1a50dd4d08a9055547376358d"} Sep 30 05:32:46 crc kubenswrapper[4956]: I0930 05:32:46.277990 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bs6" event={"ID":"23fe685f-5eac-43e5-b4fc-c48e85553142","Type":"ContainerStarted","Data":"2b3bc720657a92d4954ecfc5f0c440ee03285e0662f09ea193a6a0e22204e3cb"} Sep 30 05:32:46 crc kubenswrapper[4956]: I0930 05:32:46.283722 4956 generic.go:334] "Generic (PLEG): container finished" podID="de4dac4b-b680-4e12-beb2-a999348eddd7" containerID="84ce65a6b7205e667e9ea53d3bd42b0da135954dc26f02a2fff8bdbc57756c59" exitCode=0 Sep 30 05:32:46 crc kubenswrapper[4956]: I0930 05:32:46.283817 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q9m7s" event={"ID":"de4dac4b-b680-4e12-beb2-a999348eddd7","Type":"ContainerDied","Data":"84ce65a6b7205e667e9ea53d3bd42b0da135954dc26f02a2fff8bdbc57756c59"} Sep 30 05:32:47 crc kubenswrapper[4956]: I0930 05:32:47.299313 4956 generic.go:334] "Generic (PLEG): container finished" podID="88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" containerID="54219330ee092221fdc6c1bb976e4857084b942c240f99f725bb8a577803bd91" exitCode=0 Sep 30 05:32:47 crc kubenswrapper[4956]: I0930 05:32:47.299385 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgpv" event={"ID":"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5","Type":"ContainerDied","Data":"54219330ee092221fdc6c1bb976e4857084b942c240f99f725bb8a577803bd91"} Sep 30 05:32:47 crc kubenswrapper[4956]: 
I0930 05:32:47.303277 4956 generic.go:334] "Generic (PLEG): container finished" podID="23fe685f-5eac-43e5-b4fc-c48e85553142" containerID="a0df705ba135beb14501c9835257f8b069e2c8f752763ab543345307dc098921" exitCode=0 Sep 30 05:32:47 crc kubenswrapper[4956]: I0930 05:32:47.303403 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bs6" event={"ID":"23fe685f-5eac-43e5-b4fc-c48e85553142","Type":"ContainerDied","Data":"a0df705ba135beb14501c9835257f8b069e2c8f752763ab543345307dc098921"} Sep 30 05:32:47 crc kubenswrapper[4956]: I0930 05:32:47.319326 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q9m7s" event={"ID":"de4dac4b-b680-4e12-beb2-a999348eddd7","Type":"ContainerStarted","Data":"ba669be1c371b828edb325c9a286fdd8e50fd6b69cb2b9dc4809dacee40fe19e"} Sep 30 05:32:47 crc kubenswrapper[4956]: I0930 05:32:47.329393 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjpbl" event={"ID":"4b003c5c-9c5c-4732-9370-c49aed57a7c2","Type":"ContainerStarted","Data":"f50ba4c1bd2e772aebda72bd272480981181d30032e650738415737e2d0276c2"} Sep 30 05:32:47 crc kubenswrapper[4956]: I0930 05:32:47.355553 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q9m7s" podStartSLOduration=2.871517258 podStartE2EDuration="5.355530151s" podCreationTimestamp="2025-09-30 05:32:42 +0000 UTC" firstStartedPulling="2025-09-30 05:32:44.246707867 +0000 UTC m=+234.573828402" lastFinishedPulling="2025-09-30 05:32:46.73072076 +0000 UTC m=+237.057841295" observedRunningTime="2025-09-30 05:32:47.3523817 +0000 UTC m=+237.679502225" watchObservedRunningTime="2025-09-30 05:32:47.355530151 +0000 UTC m=+237.682650676" Sep 30 05:32:47 crc kubenswrapper[4956]: I0930 05:32:47.395064 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rjpbl" 
podStartSLOduration=2.944378262 podStartE2EDuration="5.395039452s" podCreationTimestamp="2025-09-30 05:32:42 +0000 UTC" firstStartedPulling="2025-09-30 05:32:44.244900659 +0000 UTC m=+234.572021204" lastFinishedPulling="2025-09-30 05:32:46.695561839 +0000 UTC m=+237.022682394" observedRunningTime="2025-09-30 05:32:47.391441596 +0000 UTC m=+237.718562141" watchObservedRunningTime="2025-09-30 05:32:47.395039452 +0000 UTC m=+237.722159987" Sep 30 05:32:49 crc kubenswrapper[4956]: I0930 05:32:49.347961 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgpv" event={"ID":"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5","Type":"ContainerStarted","Data":"4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de"} Sep 30 05:32:49 crc kubenswrapper[4956]: I0930 05:32:49.353714 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bs6" event={"ID":"23fe685f-5eac-43e5-b4fc-c48e85553142","Type":"ContainerStarted","Data":"e93bf60e387308ee14c39595df27547e8d8f0634ce2ccac5af62b3b4577d2715"} Sep 30 05:32:49 crc kubenswrapper[4956]: I0930 05:32:49.371642 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nmgpv" podStartSLOduration=3.920793957 podStartE2EDuration="5.37160929s" podCreationTimestamp="2025-09-30 05:32:44 +0000 UTC" firstStartedPulling="2025-09-30 05:32:46.275670441 +0000 UTC m=+236.602790966" lastFinishedPulling="2025-09-30 05:32:47.726485774 +0000 UTC m=+238.053606299" observedRunningTime="2025-09-30 05:32:49.363477818 +0000 UTC m=+239.690598343" watchObservedRunningTime="2025-09-30 05:32:49.37160929 +0000 UTC m=+239.698729855" Sep 30 05:32:49 crc kubenswrapper[4956]: I0930 05:32:49.381848 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k2bs6" podStartSLOduration=2.882850815 podStartE2EDuration="4.381815078s" 
podCreationTimestamp="2025-09-30 05:32:45 +0000 UTC" firstStartedPulling="2025-09-30 05:32:46.280446924 +0000 UTC m=+236.607567449" lastFinishedPulling="2025-09-30 05:32:47.779411187 +0000 UTC m=+238.106531712" observedRunningTime="2025-09-30 05:32:49.380067171 +0000 UTC m=+239.707187716" watchObservedRunningTime="2025-09-30 05:32:49.381815078 +0000 UTC m=+239.708935623" Sep 30 05:32:51 crc kubenswrapper[4956]: I0930 05:32:51.767018 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4fn5g"] Sep 30 05:32:52 crc kubenswrapper[4956]: I0930 05:32:52.818226 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rjpbl" Sep 30 05:32:52 crc kubenswrapper[4956]: I0930 05:32:52.818818 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rjpbl" Sep 30 05:32:52 crc kubenswrapper[4956]: I0930 05:32:52.862597 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rjpbl" Sep 30 05:32:53 crc kubenswrapper[4956]: I0930 05:32:53.043250 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q9m7s" Sep 30 05:32:53 crc kubenswrapper[4956]: I0930 05:32:53.043322 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q9m7s" Sep 30 05:32:53 crc kubenswrapper[4956]: I0930 05:32:53.102272 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q9m7s" Sep 30 05:32:53 crc kubenswrapper[4956]: I0930 05:32:53.429431 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rjpbl" Sep 30 05:32:53 crc kubenswrapper[4956]: I0930 05:32:53.445873 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-q9m7s" Sep 30 05:32:55 crc kubenswrapper[4956]: I0930 05:32:55.214637 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nmgpv" Sep 30 05:32:55 crc kubenswrapper[4956]: I0930 05:32:55.214880 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nmgpv" Sep 30 05:32:55 crc kubenswrapper[4956]: I0930 05:32:55.278259 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nmgpv" Sep 30 05:32:55 crc kubenswrapper[4956]: I0930 05:32:55.445304 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nmgpv" Sep 30 05:32:55 crc kubenswrapper[4956]: I0930 05:32:55.482012 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k2bs6" Sep 30 05:32:55 crc kubenswrapper[4956]: I0930 05:32:55.482162 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k2bs6" Sep 30 05:32:55 crc kubenswrapper[4956]: I0930 05:32:55.515199 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k2bs6" Sep 30 05:32:56 crc kubenswrapper[4956]: I0930 05:32:56.445967 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k2bs6" Sep 30 05:33:16 crc kubenswrapper[4956]: I0930 05:33:16.798976 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" podUID="4608fdfb-f63a-4992-889e-3bbf043257b6" containerName="oauth-openshift" containerID="cri-o://e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8" gracePeriod=15 Sep 30 05:33:17 crc 
kubenswrapper[4956]: I0930 05:33:17.284092 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.316667 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-785f95f6b7-tnwx4"] Sep 30 05:33:17 crc kubenswrapper[4956]: E0930 05:33:17.317026 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4608fdfb-f63a-4992-889e-3bbf043257b6" containerName="oauth-openshift" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.317048 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4608fdfb-f63a-4992-889e-3bbf043257b6" containerName="oauth-openshift" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.317246 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4608fdfb-f63a-4992-889e-3bbf043257b6" containerName="oauth-openshift" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.317853 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.329770 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-785f95f6b7-tnwx4"] Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.396765 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkwbs\" (UniqueName: \"kubernetes.io/projected/4608fdfb-f63a-4992-889e-3bbf043257b6-kube-api-access-gkwbs\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.396822 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-ocp-branding-template\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.396862 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4608fdfb-f63a-4992-889e-3bbf043257b6-audit-dir\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.396911 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-audit-policies\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.396946 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-service-ca\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.396996 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-serving-cert\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397021 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-cliconfig\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397047 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-error\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397095 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-router-certs\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397151 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-login\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397181 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-idp-0-file-data\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397210 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-trusted-ca-bundle\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397238 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-provider-selection\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397261 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-session\") pod \"4608fdfb-f63a-4992-889e-3bbf043257b6\" (UID: \"4608fdfb-f63a-4992-889e-3bbf043257b6\") " Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397397 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-router-certs\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397431 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/691563a6-65a3-4a32-9236-bc77928f08d9-audit-policies\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397456 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397486 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397514 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: 
\"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397535 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397556 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/691563a6-65a3-4a32-9236-bc77928f08d9-audit-dir\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397608 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-service-ca\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397636 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmhqw\" (UniqueName: \"kubernetes.io/projected/691563a6-65a3-4a32-9236-bc77928f08d9-kube-api-access-bmhqw\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397672 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-user-template-login\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397702 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-user-template-error\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397738 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397779 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.397800 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-session\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.398184 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.399421 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.399502 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4608fdfb-f63a-4992-889e-3bbf043257b6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.399748 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.400053 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.403938 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.404536 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.405045 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.405416 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4608fdfb-f63a-4992-889e-3bbf043257b6-kube-api-access-gkwbs" (OuterVolumeSpecName: "kube-api-access-gkwbs") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "kube-api-access-gkwbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.405463 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.405700 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.405944 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.406535 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.406673 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4608fdfb-f63a-4992-889e-3bbf043257b6" (UID: "4608fdfb-f63a-4992-889e-3bbf043257b6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.498719 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-router-certs\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.498786 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/691563a6-65a3-4a32-9236-bc77928f08d9-audit-policies\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.498818 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.498863 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.498893 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.498914 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.498941 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/691563a6-65a3-4a32-9236-bc77928f08d9-audit-dir\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.498962 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-service-ca\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.498986 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmhqw\" (UniqueName: \"kubernetes.io/projected/691563a6-65a3-4a32-9236-bc77928f08d9-kube-api-access-bmhqw\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " 
pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499027 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-user-template-login\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499049 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-user-template-error\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499081 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499163 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499203 4956 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-session\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499267 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499283 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499295 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499308 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499321 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499333 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499346 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499360 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499373 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499385 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkwbs\" (UniqueName: \"kubernetes.io/projected/4608fdfb-f63a-4992-889e-3bbf043257b6-kube-api-access-gkwbs\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499398 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499413 4956 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4608fdfb-f63a-4992-889e-3bbf043257b6-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499425 4956 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.499436 4956 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4608fdfb-f63a-4992-889e-3bbf043257b6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.500290 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/691563a6-65a3-4a32-9236-bc77928f08d9-audit-policies\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.500545 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/691563a6-65a3-4a32-9236-bc77928f08d9-audit-dir\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.501311 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.501869 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-service-ca\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.501915 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.504835 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-session\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.505184 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-router-certs\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.505450 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 
30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.505620 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-user-template-error\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.506823 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.507158 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-user-template-login\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.507262 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.507295 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/691563a6-65a3-4a32-9236-bc77928f08d9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.519286 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmhqw\" (UniqueName: \"kubernetes.io/projected/691563a6-65a3-4a32-9236-bc77928f08d9-kube-api-access-bmhqw\") pod \"oauth-openshift-785f95f6b7-tnwx4\" (UID: \"691563a6-65a3-4a32-9236-bc77928f08d9\") " pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.533928 4956 generic.go:334] "Generic (PLEG): container finished" podID="4608fdfb-f63a-4992-889e-3bbf043257b6" containerID="e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8" exitCode=0 Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.533985 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" event={"ID":"4608fdfb-f63a-4992-889e-3bbf043257b6","Type":"ContainerDied","Data":"e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8"} Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.534009 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.534039 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4fn5g" event={"ID":"4608fdfb-f63a-4992-889e-3bbf043257b6","Type":"ContainerDied","Data":"fd772c2879b206f528f3a6f36f5300d7fefe3b729a3fe64e9c8933e3aa9ee160"} Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.534063 4956 scope.go:117] "RemoveContainer" containerID="e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.561260 4956 scope.go:117] "RemoveContainer" containerID="e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8" Sep 30 05:33:17 crc kubenswrapper[4956]: E0930 05:33:17.566687 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8\": container with ID starting with e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8 not found: ID does not exist" containerID="e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.566748 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8"} err="failed to get container status \"e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8\": rpc error: code = NotFound desc = could not find container \"e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8\": container with ID starting with e7c3ae25e20b6c8cbe3fb0f0439a745b63b0d8dab8a245ab64bbd16c85a55da8 not found: ID does not exist" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.568933 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-4fn5g"] Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.581862 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4fn5g"] Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.648013 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:17 crc kubenswrapper[4956]: I0930 05:33:17.833862 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-785f95f6b7-tnwx4"] Sep 30 05:33:18 crc kubenswrapper[4956]: I0930 05:33:18.346823 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4608fdfb-f63a-4992-889e-3bbf043257b6" path="/var/lib/kubelet/pods/4608fdfb-f63a-4992-889e-3bbf043257b6/volumes" Sep 30 05:33:18 crc kubenswrapper[4956]: I0930 05:33:18.541010 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" event={"ID":"691563a6-65a3-4a32-9236-bc77928f08d9","Type":"ContainerStarted","Data":"60ece9e2ed4f8823312666915e1a14a018d55e23621784f441f0d952f61adfac"} Sep 30 05:33:18 crc kubenswrapper[4956]: I0930 05:33:18.541101 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" event={"ID":"691563a6-65a3-4a32-9236-bc77928f08d9","Type":"ContainerStarted","Data":"50a454602c8372b2922561a8a6d53ad1dcdeeca250723c0815cde1d30ec7018e"} Sep 30 05:33:18 crc kubenswrapper[4956]: I0930 05:33:18.541323 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:33:18 crc kubenswrapper[4956]: I0930 05:33:18.567446 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" podStartSLOduration=27.567413661 
podStartE2EDuration="27.567413661s" podCreationTimestamp="2025-09-30 05:32:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:33:18.559693802 +0000 UTC m=+268.886814327" watchObservedRunningTime="2025-09-30 05:33:18.567413661 +0000 UTC m=+268.894534216" Sep 30 05:33:18 crc kubenswrapper[4956]: I0930 05:33:18.703061 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-785f95f6b7-tnwx4" Sep 30 05:34:18 crc kubenswrapper[4956]: I0930 05:34:18.073511 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:34:18 crc kubenswrapper[4956]: I0930 05:34:18.074098 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:34:48 crc kubenswrapper[4956]: I0930 05:34:48.077934 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:34:48 crc kubenswrapper[4956]: I0930 05:34:48.078565 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:35:18 crc kubenswrapper[4956]: I0930 05:35:18.073923 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:35:18 crc kubenswrapper[4956]: I0930 05:35:18.074569 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:35:18 crc kubenswrapper[4956]: I0930 05:35:18.074634 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:35:18 crc kubenswrapper[4956]: I0930 05:35:18.075621 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f7ecb2580b031d845f054b28f4a97a97c58fe86efb667130fe849d07f1e2cafa"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 05:35:18 crc kubenswrapper[4956]: I0930 05:35:18.075712 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://f7ecb2580b031d845f054b28f4a97a97c58fe86efb667130fe849d07f1e2cafa" gracePeriod=600 Sep 30 05:35:18 crc kubenswrapper[4956]: I0930 05:35:18.277803 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="f7ecb2580b031d845f054b28f4a97a97c58fe86efb667130fe849d07f1e2cafa" exitCode=0 Sep 30 05:35:18 crc kubenswrapper[4956]: I0930 05:35:18.277882 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"f7ecb2580b031d845f054b28f4a97a97c58fe86efb667130fe849d07f1e2cafa"} Sep 30 05:35:18 crc kubenswrapper[4956]: I0930 05:35:18.277955 4956 scope.go:117] "RemoveContainer" containerID="74ff672a676448ddd79714c7e6366de45489f88ecf93d7aeaffa8c7835cbc6ef" Sep 30 05:35:19 crc kubenswrapper[4956]: I0930 05:35:19.286307 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"bf32bc96b878134a19be7c69abbea20183ca590cbab32abb08f319a6b44d2808"} Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.536061 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-245fk"] Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.537438 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.562719 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-245fk"] Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.708739 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-bound-sa-token\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.708829 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-registry-certificates\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.708930 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4wf2\" (UniqueName: \"kubernetes.io/projected/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-kube-api-access-c4wf2\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.708982 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-trusted-ca\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.709200 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.709321 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.709420 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-registry-tls\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.709653 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.735054 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.811532 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-bound-sa-token\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.811645 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-registry-certificates\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.811697 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4wf2\" (UniqueName: \"kubernetes.io/projected/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-kube-api-access-c4wf2\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.811751 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-trusted-ca\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc 
kubenswrapper[4956]: I0930 05:36:24.811793 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.811852 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.811893 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-registry-tls\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.812876 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.813431 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-registry-certificates\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.814029 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-trusted-ca\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.820655 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-registry-tls\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.820756 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.846583 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-bound-sa-token\") pod \"image-registry-66df7c8f76-245fk\" (UID: \"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.854293 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4wf2\" (UniqueName: \"kubernetes.io/projected/95cacf2a-6b94-4e03-89bc-11dc1ad64e6f-kube-api-access-c4wf2\") pod \"image-registry-66df7c8f76-245fk\" (UID: 
\"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f\") " pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:24 crc kubenswrapper[4956]: I0930 05:36:24.858941 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:25 crc kubenswrapper[4956]: I0930 05:36:25.054394 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-245fk"] Sep 30 05:36:25 crc kubenswrapper[4956]: I0930 05:36:25.684746 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-245fk" event={"ID":"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f","Type":"ContainerStarted","Data":"1dc1f905a8abc3f26c4569c70d8874894b93d2dcd19bbeb9cbae2e882d1df3d3"} Sep 30 05:36:25 crc kubenswrapper[4956]: I0930 05:36:25.685031 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-245fk" event={"ID":"95cacf2a-6b94-4e03-89bc-11dc1ad64e6f","Type":"ContainerStarted","Data":"bf377b7669555317d668279cc93bae2f3a441834048a31ae0f7ce40f116736a2"} Sep 30 05:36:25 crc kubenswrapper[4956]: I0930 05:36:25.685048 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:25 crc kubenswrapper[4956]: I0930 05:36:25.701319 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-245fk" podStartSLOduration=1.701304546 podStartE2EDuration="1.701304546s" podCreationTimestamp="2025-09-30 05:36:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:36:25.699885872 +0000 UTC m=+456.027006397" watchObservedRunningTime="2025-09-30 05:36:25.701304546 +0000 UTC m=+456.028425071" Sep 30 05:36:44 crc kubenswrapper[4956]: I0930 
05:36:44.864687 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-245fk" Sep 30 05:36:44 crc kubenswrapper[4956]: I0930 05:36:44.923794 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5kh24"] Sep 30 05:37:09 crc kubenswrapper[4956]: I0930 05:37:09.979576 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" podUID="98c27d5f-42a5-4c1b-b5f4-49dcef583537" containerName="registry" containerID="cri-o://78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617" gracePeriod=30 Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.401351 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.565689 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/98c27d5f-42a5-4c1b-b5f4-49dcef583537-registry-certificates\") pod \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.565736 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98c27d5f-42a5-4c1b-b5f4-49dcef583537-trusted-ca\") pod \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.566481 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c27d5f-42a5-4c1b-b5f4-49dcef583537-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "98c27d5f-42a5-4c1b-b5f4-49dcef583537" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.565777 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-registry-tls\") pod \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.566645 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.566666 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/98c27d5f-42a5-4c1b-b5f4-49dcef583537-ca-trust-extracted\") pod \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.566676 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c27d5f-42a5-4c1b-b5f4-49dcef583537-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "98c27d5f-42a5-4c1b-b5f4-49dcef583537" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.566908 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-bound-sa-token\") pod \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.566949 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/98c27d5f-42a5-4c1b-b5f4-49dcef583537-installation-pull-secrets\") pod \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.566987 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm6bz\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-kube-api-access-dm6bz\") pod \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\" (UID: \"98c27d5f-42a5-4c1b-b5f4-49dcef583537\") " Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.567637 4956 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/98c27d5f-42a5-4c1b-b5f4-49dcef583537-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.567650 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98c27d5f-42a5-4c1b-b5f4-49dcef583537-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.572296 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c27d5f-42a5-4c1b-b5f4-49dcef583537-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod 
"98c27d5f-42a5-4c1b-b5f4-49dcef583537" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.572884 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "98c27d5f-42a5-4c1b-b5f4-49dcef583537" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.573000 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "98c27d5f-42a5-4c1b-b5f4-49dcef583537" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.577731 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-kube-api-access-dm6bz" (OuterVolumeSpecName: "kube-api-access-dm6bz") pod "98c27d5f-42a5-4c1b-b5f4-49dcef583537" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537"). InnerVolumeSpecName "kube-api-access-dm6bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.577927 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "98c27d5f-42a5-4c1b-b5f4-49dcef583537" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.591382 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c27d5f-42a5-4c1b-b5f4-49dcef583537-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "98c27d5f-42a5-4c1b-b5f4-49dcef583537" (UID: "98c27d5f-42a5-4c1b-b5f4-49dcef583537"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.668221 4956 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.668252 4956 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/98c27d5f-42a5-4c1b-b5f4-49dcef583537-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.668266 4956 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.668278 4956 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/98c27d5f-42a5-4c1b-b5f4-49dcef583537-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.668292 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm6bz\" (UniqueName: \"kubernetes.io/projected/98c27d5f-42a5-4c1b-b5f4-49dcef583537-kube-api-access-dm6bz\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.927835 4956 generic.go:334] "Generic (PLEG): container 
finished" podID="98c27d5f-42a5-4c1b-b5f4-49dcef583537" containerID="78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617" exitCode=0 Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.927888 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.927884 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" event={"ID":"98c27d5f-42a5-4c1b-b5f4-49dcef583537","Type":"ContainerDied","Data":"78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617"} Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.928067 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5kh24" event={"ID":"98c27d5f-42a5-4c1b-b5f4-49dcef583537","Type":"ContainerDied","Data":"a46d696672ca20a9f156bbc422bac2da96a228600d47bd4165a802152ffd9698"} Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.928086 4956 scope.go:117] "RemoveContainer" containerID="78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.959651 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5kh24"] Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.962819 4956 scope.go:117] "RemoveContainer" containerID="78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617" Sep 30 05:37:10 crc kubenswrapper[4956]: E0930 05:37:10.963253 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617\": container with ID starting with 78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617 not found: ID does not exist" 
containerID="78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.963284 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617"} err="failed to get container status \"78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617\": rpc error: code = NotFound desc = could not find container \"78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617\": container with ID starting with 78807fc062fd3f9441de44f55e42de5e09a5912c93ceb34ac0ac51ffbf22f617 not found: ID does not exist" Sep 30 05:37:10 crc kubenswrapper[4956]: I0930 05:37:10.965399 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5kh24"] Sep 30 05:37:12 crc kubenswrapper[4956]: I0930 05:37:12.349819 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c27d5f-42a5-4c1b-b5f4-49dcef583537" path="/var/lib/kubelet/pods/98c27d5f-42a5-4c1b-b5f4-49dcef583537/volumes" Sep 30 05:37:18 crc kubenswrapper[4956]: I0930 05:37:18.073413 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:37:18 crc kubenswrapper[4956]: I0930 05:37:18.073912 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.260608 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-cainjector-7f985d654d-9jcp8"] Sep 30 05:37:42 crc kubenswrapper[4956]: E0930 05:37:42.261290 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c27d5f-42a5-4c1b-b5f4-49dcef583537" containerName="registry" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.261302 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c27d5f-42a5-4c1b-b5f4-49dcef583537" containerName="registry" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.261392 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c27d5f-42a5-4c1b-b5f4-49dcef583537" containerName="registry" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.261730 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-9jcp8" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.264843 4956 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-x52ps" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.264931 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.264983 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.272339 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-skkpn"] Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.273007 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-skkpn" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.274848 4956 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-stk58" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.276085 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-9jcp8"] Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.291573 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-cmwsp"] Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.292258 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-cmwsp" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.295780 4956 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-hzt4p" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.297347 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-skkpn"] Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.309378 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-cmwsp"] Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.390727 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc5bd\" (UniqueName: \"kubernetes.io/projected/50748964-4222-40d7-a12c-6ab004bf8a77-kube-api-access-hc5bd\") pod \"cert-manager-webhook-5655c58dd6-cmwsp\" (UID: \"50748964-4222-40d7-a12c-6ab004bf8a77\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-cmwsp" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.390832 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmt4k\" (UniqueName: 
\"kubernetes.io/projected/528b17e7-a42f-4e5e-8731-4f3d84d59cf7-kube-api-access-bmt4k\") pod \"cert-manager-5b446d88c5-skkpn\" (UID: \"528b17e7-a42f-4e5e-8731-4f3d84d59cf7\") " pod="cert-manager/cert-manager-5b446d88c5-skkpn" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.390908 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bth4g\" (UniqueName: \"kubernetes.io/projected/5384e744-0e0a-4743-bf15-cb75c35951ac-kube-api-access-bth4g\") pod \"cert-manager-cainjector-7f985d654d-9jcp8\" (UID: \"5384e744-0e0a-4743-bf15-cb75c35951ac\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-9jcp8" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.492221 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bth4g\" (UniqueName: \"kubernetes.io/projected/5384e744-0e0a-4743-bf15-cb75c35951ac-kube-api-access-bth4g\") pod \"cert-manager-cainjector-7f985d654d-9jcp8\" (UID: \"5384e744-0e0a-4743-bf15-cb75c35951ac\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-9jcp8" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.492313 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc5bd\" (UniqueName: \"kubernetes.io/projected/50748964-4222-40d7-a12c-6ab004bf8a77-kube-api-access-hc5bd\") pod \"cert-manager-webhook-5655c58dd6-cmwsp\" (UID: \"50748964-4222-40d7-a12c-6ab004bf8a77\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-cmwsp" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.492339 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmt4k\" (UniqueName: \"kubernetes.io/projected/528b17e7-a42f-4e5e-8731-4f3d84d59cf7-kube-api-access-bmt4k\") pod \"cert-manager-5b446d88c5-skkpn\" (UID: \"528b17e7-a42f-4e5e-8731-4f3d84d59cf7\") " pod="cert-manager/cert-manager-5b446d88c5-skkpn" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.510134 
4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmt4k\" (UniqueName: \"kubernetes.io/projected/528b17e7-a42f-4e5e-8731-4f3d84d59cf7-kube-api-access-bmt4k\") pod \"cert-manager-5b446d88c5-skkpn\" (UID: \"528b17e7-a42f-4e5e-8731-4f3d84d59cf7\") " pod="cert-manager/cert-manager-5b446d88c5-skkpn" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.510829 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc5bd\" (UniqueName: \"kubernetes.io/projected/50748964-4222-40d7-a12c-6ab004bf8a77-kube-api-access-hc5bd\") pod \"cert-manager-webhook-5655c58dd6-cmwsp\" (UID: \"50748964-4222-40d7-a12c-6ab004bf8a77\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-cmwsp" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.511023 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bth4g\" (UniqueName: \"kubernetes.io/projected/5384e744-0e0a-4743-bf15-cb75c35951ac-kube-api-access-bth4g\") pod \"cert-manager-cainjector-7f985d654d-9jcp8\" (UID: \"5384e744-0e0a-4743-bf15-cb75c35951ac\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-9jcp8" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.581836 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-9jcp8" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.596996 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-skkpn" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.608093 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-cmwsp" Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.849182 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-cmwsp"] Sep 30 05:37:42 crc kubenswrapper[4956]: W0930 05:37:42.859999 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50748964_4222_40d7_a12c_6ab004bf8a77.slice/crio-7f497a08d600ea14989f26e0d1d7c78cfa2dbb0cecd096ca5c1eb529a4ca7820 WatchSource:0}: Error finding container 7f497a08d600ea14989f26e0d1d7c78cfa2dbb0cecd096ca5c1eb529a4ca7820: Status 404 returned error can't find the container with id 7f497a08d600ea14989f26e0d1d7c78cfa2dbb0cecd096ca5c1eb529a4ca7820 Sep 30 05:37:42 crc kubenswrapper[4956]: I0930 05:37:42.862735 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 05:37:43 crc kubenswrapper[4956]: I0930 05:37:43.008031 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-9jcp8"] Sep 30 05:37:43 crc kubenswrapper[4956]: I0930 05:37:43.011352 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-skkpn"] Sep 30 05:37:43 crc kubenswrapper[4956]: I0930 05:37:43.148410 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-skkpn" event={"ID":"528b17e7-a42f-4e5e-8731-4f3d84d59cf7","Type":"ContainerStarted","Data":"cdcda596030dbf0878dd0b7eb404d95d6588fe31f3c4430f046ea3809817b495"} Sep 30 05:37:43 crc kubenswrapper[4956]: I0930 05:37:43.149856 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-cmwsp" event={"ID":"50748964-4222-40d7-a12c-6ab004bf8a77","Type":"ContainerStarted","Data":"7f497a08d600ea14989f26e0d1d7c78cfa2dbb0cecd096ca5c1eb529a4ca7820"} Sep 30 05:37:43 crc kubenswrapper[4956]: 
I0930 05:37:43.150878 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-9jcp8" event={"ID":"5384e744-0e0a-4743-bf15-cb75c35951ac","Type":"ContainerStarted","Data":"fac4056a1410b195e9ce93a9d75633f1f4ce1663091b0067881b110eac609001"} Sep 30 05:37:47 crc kubenswrapper[4956]: I0930 05:37:47.174483 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-skkpn" event={"ID":"528b17e7-a42f-4e5e-8731-4f3d84d59cf7","Type":"ContainerStarted","Data":"7e15f8c9f48a471b9f804c31149ca58293d8dc441442678ac4bbcb28cbad8cf3"} Sep 30 05:37:47 crc kubenswrapper[4956]: I0930 05:37:47.178785 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-cmwsp" event={"ID":"50748964-4222-40d7-a12c-6ab004bf8a77","Type":"ContainerStarted","Data":"381e6c75c54d655ac293924275bd412400be004ce13edb9a0e13a09068c228ae"} Sep 30 05:37:47 crc kubenswrapper[4956]: I0930 05:37:47.179276 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-cmwsp" Sep 30 05:37:47 crc kubenswrapper[4956]: I0930 05:37:47.180315 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-9jcp8" event={"ID":"5384e744-0e0a-4743-bf15-cb75c35951ac","Type":"ContainerStarted","Data":"eaa0f2ac612b319f6c45d4f10130f04cdd3ccb2c38ec94283b847145b59723f1"} Sep 30 05:37:47 crc kubenswrapper[4956]: I0930 05:37:47.192000 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-skkpn" podStartSLOduration=2.046400383 podStartE2EDuration="5.191976578s" podCreationTimestamp="2025-09-30 05:37:42 +0000 UTC" firstStartedPulling="2025-09-30 05:37:43.014712022 +0000 UTC m=+533.341832547" lastFinishedPulling="2025-09-30 05:37:46.160288217 +0000 UTC m=+536.487408742" observedRunningTime="2025-09-30 05:37:47.188769545 +0000 UTC m=+537.515890080" 
watchObservedRunningTime="2025-09-30 05:37:47.191976578 +0000 UTC m=+537.519097143" Sep 30 05:37:47 crc kubenswrapper[4956]: I0930 05:37:47.216463 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-9jcp8" podStartSLOduration=2.080461444 podStartE2EDuration="5.216441989s" podCreationTimestamp="2025-09-30 05:37:42 +0000 UTC" firstStartedPulling="2025-09-30 05:37:43.016604113 +0000 UTC m=+533.343724658" lastFinishedPulling="2025-09-30 05:37:46.152584678 +0000 UTC m=+536.479705203" observedRunningTime="2025-09-30 05:37:47.212706858 +0000 UTC m=+537.539827423" watchObservedRunningTime="2025-09-30 05:37:47.216441989 +0000 UTC m=+537.543562544" Sep 30 05:37:47 crc kubenswrapper[4956]: I0930 05:37:47.237015 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-cmwsp" podStartSLOduration=1.957552003 podStartE2EDuration="5.236988563s" podCreationTimestamp="2025-09-30 05:37:42 +0000 UTC" firstStartedPulling="2025-09-30 05:37:42.862537855 +0000 UTC m=+533.189658380" lastFinishedPulling="2025-09-30 05:37:46.141974415 +0000 UTC m=+536.469094940" observedRunningTime="2025-09-30 05:37:47.234201232 +0000 UTC m=+537.561321797" watchObservedRunningTime="2025-09-30 05:37:47.236988563 +0000 UTC m=+537.564109118" Sep 30 05:37:48 crc kubenswrapper[4956]: I0930 05:37:48.075321 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:37:48 crc kubenswrapper[4956]: I0930 05:37:48.075459 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:37:52 crc kubenswrapper[4956]: I0930 05:37:52.611208 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-cmwsp" Sep 30 05:37:52 crc kubenswrapper[4956]: I0930 05:37:52.875383 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j8sw2"] Sep 30 05:37:52 crc kubenswrapper[4956]: I0930 05:37:52.876070 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="nbdb" containerID="cri-o://f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef" gracePeriod=30 Sep 30 05:37:52 crc kubenswrapper[4956]: I0930 05:37:52.876378 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="northd" containerID="cri-o://aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1" gracePeriod=30 Sep 30 05:37:52 crc kubenswrapper[4956]: I0930 05:37:52.876526 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="sbdb" containerID="cri-o://7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10" gracePeriod=30 Sep 30 05:37:52 crc kubenswrapper[4956]: I0930 05:37:52.875994 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovn-controller" containerID="cri-o://ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360" gracePeriod=30 Sep 30 05:37:52 crc kubenswrapper[4956]: I0930 05:37:52.876613 4956 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29" gracePeriod=30 Sep 30 05:37:52 crc kubenswrapper[4956]: I0930 05:37:52.876635 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="kube-rbac-proxy-node" containerID="cri-o://6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b" gracePeriod=30 Sep 30 05:37:52 crc kubenswrapper[4956]: I0930 05:37:52.876622 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovn-acl-logging" containerID="cri-o://f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125" gracePeriod=30 Sep 30 05:37:52 crc kubenswrapper[4956]: I0930 05:37:52.911842 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" containerID="cri-o://60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3" gracePeriod=30 Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.137891 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/3.log" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.141458 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovn-acl-logging/0.log" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.142035 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovn-controller/0.log" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.142505 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.185536 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29mm8"] Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.185766 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.185787 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.185798 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="kubecfg-setup" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.185805 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="kubecfg-setup" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.185817 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovn-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.185825 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovn-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.185834 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovn-acl-logging" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.185842 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovn-acl-logging" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.185851 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="sbdb" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.185858 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="sbdb" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.185870 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.185879 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.185890 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="nbdb" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.185898 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="nbdb" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.185909 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.185917 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.185927 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.185935 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.185958 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="kube-rbac-proxy-node" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.185966 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="kube-rbac-proxy-node" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.185983 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="northd" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.185990 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="northd" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186098 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186123 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186132 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186140 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186166 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="kube-rbac-proxy-node" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186188 4956 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="sbdb" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186198 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovn-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186207 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186218 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="nbdb" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186226 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovn-acl-logging" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186234 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="northd" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.186370 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186383 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.186400 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186410 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.186524 4956 
memory_manager.go:354] "RemoveStaleState removing state" podUID="29df1c73-1262-4143-b710-bc690edc2ab8" containerName="ovnkube-controller" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.188559 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.217352 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovnkube-controller/3.log" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219011 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovn-acl-logging/0.log" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219417 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8sw2_29df1c73-1262-4143-b710-bc690edc2ab8/ovn-controller/0.log" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219810 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" containerID="60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3" exitCode=0 Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219834 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" containerID="7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10" exitCode=0 Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219843 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" containerID="f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef" exitCode=0 Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219853 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" 
containerID="aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1" exitCode=0 Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219861 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" containerID="13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29" exitCode=0 Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219869 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" containerID="6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b" exitCode=0 Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219877 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" containerID="f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125" exitCode=143 Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219888 4956 generic.go:334] "Generic (PLEG): container finished" podID="29df1c73-1262-4143-b710-bc690edc2ab8" containerID="ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360" exitCode=143 Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219875 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219916 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219940 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219956 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219968 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219980 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.219992 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220008 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220021 4956 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220027 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220023 4956 scope.go:117] "RemoveContainer" containerID="60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220033 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220188 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220195 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220201 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220206 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220212 4956 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220221 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220231 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220238 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220243 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220249 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220254 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220259 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220264 4956 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220269 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220274 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220279 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220287 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220294 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220301 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220306 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10"} Sep 30 
05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220315 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220350 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220356 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220361 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220366 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220371 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220376 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220383 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8sw2" 
event={"ID":"29df1c73-1262-4143-b710-bc690edc2ab8","Type":"ContainerDied","Data":"381f02b5a6d726d256e21bc3cf089c46017fbf7dcfd333bb93bdb6b36b240074"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220394 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220401 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220407 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220413 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220418 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220424 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220429 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220434 4956 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220439 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.220444 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.223122 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-frfx9_72ad9902-843c-4117-9ac1-c34d525c9d55/kube-multus/2.log" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.223944 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-frfx9_72ad9902-843c-4117-9ac1-c34d525c9d55/kube-multus/1.log" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.223995 4956 generic.go:334] "Generic (PLEG): container finished" podID="72ad9902-843c-4117-9ac1-c34d525c9d55" containerID="910f9a9e20921fea2d407aca2be189b0e75ce142086dfbbea368233848f74b83" exitCode=2 Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.224030 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-frfx9" event={"ID":"72ad9902-843c-4117-9ac1-c34d525c9d55","Type":"ContainerDied","Data":"910f9a9e20921fea2d407aca2be189b0e75ce142086dfbbea368233848f74b83"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.224057 4956 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119"} Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.224609 4956 scope.go:117] "RemoveContainer" 
containerID="910f9a9e20921fea2d407aca2be189b0e75ce142086dfbbea368233848f74b83" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.224922 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-frfx9_openshift-multus(72ad9902-843c-4117-9ac1-c34d525c9d55)\"" pod="openshift-multus/multus-frfx9" podUID="72ad9902-843c-4117-9ac1-c34d525c9d55" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.236740 4956 scope.go:117] "RemoveContainer" containerID="f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.255822 4956 scope.go:117] "RemoveContainer" containerID="7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.268887 4956 scope.go:117] "RemoveContainer" containerID="f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.278514 4956 scope.go:117] "RemoveContainer" containerID="aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.289030 4956 scope.go:117] "RemoveContainer" containerID="13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.298614 4956 scope.go:117] "RemoveContainer" containerID="6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.307955 4956 scope.go:117] "RemoveContainer" containerID="f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.317772 4956 scope.go:117] "RemoveContainer" containerID="ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323158 4956 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-openvswitch\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323200 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-log-socket\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323221 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-run-netns\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323238 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-run-ovn-kubernetes\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323255 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-etc-openvswitch\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323273 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-var-lib-openvswitch\") pod 
\"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323301 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-log-socket" (OuterVolumeSpecName: "log-socket") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323321 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323345 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323344 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323358 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323353 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323374 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-env-overrides\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323398 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5xxz\" (UniqueName: \"kubernetes.io/projected/29df1c73-1262-4143-b710-bc690edc2ab8-kube-api-access-p5xxz\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323413 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-kubelet\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: 
\"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323849 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-ovnkube-config\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323870 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-node-log\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323566 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.323803 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324140 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-node-log" (OuterVolumeSpecName: "node-log") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324333 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324385 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-systemd-units\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324404 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-slash\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324459 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324489 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29df1c73-1262-4143-b710-bc690edc2ab8-ovn-node-metrics-cert\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324506 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-cni-bin\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324515 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-slash" (OuterVolumeSpecName: "host-slash") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324521 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324562 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324576 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324589 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-ovnkube-script-lib\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324648 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-cni-netd\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324677 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-ovn\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324707 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-systemd\") pod \"29df1c73-1262-4143-b710-bc690edc2ab8\" (UID: \"29df1c73-1262-4143-b710-bc690edc2ab8\") " Sep 30 05:37:53 crc 
kubenswrapper[4956]: I0930 05:37:53.324740 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324767 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324886 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-run-systemd\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324909 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-cni-bin\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.324942 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjpnl\" (UniqueName: \"kubernetes.io/projected/e5b10d87-fe01-40c9-8819-b3af7968f698-kube-api-access-xjpnl\") pod \"ovnkube-node-29mm8\" (UID: 
\"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325027 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325170 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5b10d87-fe01-40c9-8819-b3af7968f698-ovn-node-metrics-cert\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325207 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-slash\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325223 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-run-ovn-kubernetes\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325242 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/e5b10d87-fe01-40c9-8819-b3af7968f698-ovnkube-script-lib\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325258 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-kubelet\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325282 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-run-netns\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325310 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-log-socket\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325327 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-run-openvswitch\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325360 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-var-lib-openvswitch\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325381 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5b10d87-fe01-40c9-8819-b3af7968f698-env-overrides\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325395 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-cni-netd\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325412 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-etc-openvswitch\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325431 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325477 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-systemd-units\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325516 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-run-ovn\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325567 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-node-log\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325589 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5b10d87-fe01-40c9-8819-b3af7968f698-ovnkube-config\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325629 4956 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325737 4956 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-log-socket\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325788 4956 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325806 4956 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325817 4956 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325828 4956 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325841 4956 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325852 4956 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325863 4956 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-ovnkube-config\") on node \"crc\" 
DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325874 4956 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-node-log\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325884 4956 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325894 4956 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-slash\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325905 4956 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325917 4956 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325928 4956 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29df1c73-1262-4143-b710-bc690edc2ab8-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325939 4956 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.325949 4956 
reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.328582 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29df1c73-1262-4143-b710-bc690edc2ab8-kube-api-access-p5xxz" (OuterVolumeSpecName: "kube-api-access-p5xxz") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "kube-api-access-p5xxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.328793 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29df1c73-1262-4143-b710-bc690edc2ab8-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.329953 4956 scope.go:117] "RemoveContainer" containerID="05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.336368 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "29df1c73-1262-4143-b710-bc690edc2ab8" (UID: "29df1c73-1262-4143-b710-bc690edc2ab8"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.342073 4956 scope.go:117] "RemoveContainer" containerID="60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.342374 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3\": container with ID starting with 60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3 not found: ID does not exist" containerID="60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.342405 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3"} err="failed to get container status \"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3\": rpc error: code = NotFound desc = could not find container \"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3\": container with ID starting with 60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.342424 4956 scope.go:117] "RemoveContainer" containerID="f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.342652 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\": container with ID starting with f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da not found: ID does not exist" containerID="f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.342677 
4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da"} err="failed to get container status \"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\": rpc error: code = NotFound desc = could not find container \"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\": container with ID starting with f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.342692 4956 scope.go:117] "RemoveContainer" containerID="7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.342866 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\": container with ID starting with 7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10 not found: ID does not exist" containerID="7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.342888 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10"} err="failed to get container status \"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\": rpc error: code = NotFound desc = could not find container \"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\": container with ID starting with 7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.342902 4956 scope.go:117] "RemoveContainer" containerID="f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 
05:37:53.343276 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\": container with ID starting with f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef not found: ID does not exist" containerID="f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.343300 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef"} err="failed to get container status \"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\": rpc error: code = NotFound desc = could not find container \"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\": container with ID starting with f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.343313 4956 scope.go:117] "RemoveContainer" containerID="aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.343580 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\": container with ID starting with aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1 not found: ID does not exist" containerID="aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.343623 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1"} err="failed to get container status \"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\": rpc 
error: code = NotFound desc = could not find container \"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\": container with ID starting with aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.343651 4956 scope.go:117] "RemoveContainer" containerID="13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.343909 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\": container with ID starting with 13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29 not found: ID does not exist" containerID="13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.343952 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29"} err="failed to get container status \"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\": rpc error: code = NotFound desc = could not find container \"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\": container with ID starting with 13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.343994 4956 scope.go:117] "RemoveContainer" containerID="6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.344288 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\": container with ID starting with 
6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b not found: ID does not exist" containerID="6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.344308 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b"} err="failed to get container status \"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\": rpc error: code = NotFound desc = could not find container \"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\": container with ID starting with 6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.344323 4956 scope.go:117] "RemoveContainer" containerID="f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.344561 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\": container with ID starting with f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125 not found: ID does not exist" containerID="f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.344580 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125"} err="failed to get container status \"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\": rpc error: code = NotFound desc = could not find container \"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\": container with ID starting with f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125 not found: ID does not 
exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.344592 4956 scope.go:117] "RemoveContainer" containerID="ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.344815 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\": container with ID starting with ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360 not found: ID does not exist" containerID="ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.344832 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360"} err="failed to get container status \"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\": rpc error: code = NotFound desc = could not find container \"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\": container with ID starting with ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.344843 4956 scope.go:117] "RemoveContainer" containerID="05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f" Sep 30 05:37:53 crc kubenswrapper[4956]: E0930 05:37:53.345165 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\": container with ID starting with 05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f not found: ID does not exist" containerID="05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.345192 4956 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f"} err="failed to get container status \"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\": rpc error: code = NotFound desc = could not find container \"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\": container with ID starting with 05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.345204 4956 scope.go:117] "RemoveContainer" containerID="60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.345430 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3"} err="failed to get container status \"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3\": rpc error: code = NotFound desc = could not find container \"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3\": container with ID starting with 60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.345453 4956 scope.go:117] "RemoveContainer" containerID="f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.345700 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da"} err="failed to get container status \"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\": rpc error: code = NotFound desc = could not find container \"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\": container with ID starting with 
f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.345717 4956 scope.go:117] "RemoveContainer" containerID="7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.345918 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10"} err="failed to get container status \"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\": rpc error: code = NotFound desc = could not find container \"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\": container with ID starting with 7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.345936 4956 scope.go:117] "RemoveContainer" containerID="f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.346139 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef"} err="failed to get container status \"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\": rpc error: code = NotFound desc = could not find container \"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\": container with ID starting with f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.346177 4956 scope.go:117] "RemoveContainer" containerID="aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.346429 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1"} err="failed to get container status \"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\": rpc error: code = NotFound desc = could not find container \"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\": container with ID starting with aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.346447 4956 scope.go:117] "RemoveContainer" containerID="13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.346665 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29"} err="failed to get container status \"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\": rpc error: code = NotFound desc = could not find container \"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\": container with ID starting with 13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.346682 4956 scope.go:117] "RemoveContainer" containerID="6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.346926 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b"} err="failed to get container status \"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\": rpc error: code = NotFound desc = could not find container \"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\": container with ID starting with 6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b not found: ID does not 
exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.346943 4956 scope.go:117] "RemoveContainer" containerID="f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.347252 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125"} err="failed to get container status \"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\": rpc error: code = NotFound desc = could not find container \"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\": container with ID starting with f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.347269 4956 scope.go:117] "RemoveContainer" containerID="ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.347459 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360"} err="failed to get container status \"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\": rpc error: code = NotFound desc = could not find container \"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\": container with ID starting with ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.347480 4956 scope.go:117] "RemoveContainer" containerID="05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.347692 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f"} err="failed to get container status 
\"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\": rpc error: code = NotFound desc = could not find container \"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\": container with ID starting with 05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.347708 4956 scope.go:117] "RemoveContainer" containerID="60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.347901 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3"} err="failed to get container status \"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3\": rpc error: code = NotFound desc = could not find container \"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3\": container with ID starting with 60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.347920 4956 scope.go:117] "RemoveContainer" containerID="f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.348167 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da"} err="failed to get container status \"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\": rpc error: code = NotFound desc = could not find container \"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\": container with ID starting with f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.348185 4956 scope.go:117] "RemoveContainer" 
containerID="7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.348377 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10"} err="failed to get container status \"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\": rpc error: code = NotFound desc = could not find container \"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\": container with ID starting with 7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.348396 4956 scope.go:117] "RemoveContainer" containerID="f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.348617 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef"} err="failed to get container status \"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\": rpc error: code = NotFound desc = could not find container \"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\": container with ID starting with f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.348632 4956 scope.go:117] "RemoveContainer" containerID="aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.348820 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1"} err="failed to get container status \"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\": rpc error: code = NotFound desc = could 
not find container \"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\": container with ID starting with aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.348836 4956 scope.go:117] "RemoveContainer" containerID="13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.349037 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29"} err="failed to get container status \"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\": rpc error: code = NotFound desc = could not find container \"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\": container with ID starting with 13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.349052 4956 scope.go:117] "RemoveContainer" containerID="6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.349267 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b"} err="failed to get container status \"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\": rpc error: code = NotFound desc = could not find container \"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\": container with ID starting with 6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.349286 4956 scope.go:117] "RemoveContainer" containerID="f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 
05:37:53.349496 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125"} err="failed to get container status \"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\": rpc error: code = NotFound desc = could not find container \"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\": container with ID starting with f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.349523 4956 scope.go:117] "RemoveContainer" containerID="ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.349741 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360"} err="failed to get container status \"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\": rpc error: code = NotFound desc = could not find container \"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\": container with ID starting with ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.349757 4956 scope.go:117] "RemoveContainer" containerID="05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.349973 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f"} err="failed to get container status \"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\": rpc error: code = NotFound desc = could not find container \"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\": container with ID starting with 
05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.349992 4956 scope.go:117] "RemoveContainer" containerID="60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.350299 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3"} err="failed to get container status \"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3\": rpc error: code = NotFound desc = could not find container \"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3\": container with ID starting with 60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.350315 4956 scope.go:117] "RemoveContainer" containerID="f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.350528 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da"} err="failed to get container status \"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\": rpc error: code = NotFound desc = could not find container \"f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da\": container with ID starting with f4a73ea3c01bd53b6de4d123f7a6857101c7c0ce0ac3713ca616159da22b18da not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.350545 4956 scope.go:117] "RemoveContainer" containerID="7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.350725 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10"} err="failed to get container status \"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\": rpc error: code = NotFound desc = could not find container \"7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10\": container with ID starting with 7b44b9a633598889b62a4d011cfe56cca45043f0005778ea76479a56e361ad10 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.350748 4956 scope.go:117] "RemoveContainer" containerID="f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.350965 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef"} err="failed to get container status \"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\": rpc error: code = NotFound desc = could not find container \"f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef\": container with ID starting with f0cebf0d931479c4259a98fff03c3ef10ce979766a02dd290283169af0b989ef not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.350980 4956 scope.go:117] "RemoveContainer" containerID="aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.351168 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1"} err="failed to get container status \"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\": rpc error: code = NotFound desc = could not find container \"aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1\": container with ID starting with aa2ffc14a4aeda9d1237db8ce32dbca2f22a1629da0db47b91c35cc95b2b7aa1 not found: ID does not 
exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.351183 4956 scope.go:117] "RemoveContainer" containerID="13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.351385 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29"} err="failed to get container status \"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\": rpc error: code = NotFound desc = could not find container \"13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29\": container with ID starting with 13618f0f6fb52494189154391baf24d83ccc958edf62f57684f473cc78d58f29 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.351400 4956 scope.go:117] "RemoveContainer" containerID="6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.351593 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b"} err="failed to get container status \"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\": rpc error: code = NotFound desc = could not find container \"6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b\": container with ID starting with 6f2a4da222edaf36148cb03243a8c804d7182afcd797bfa6c646346d34757b2b not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.351609 4956 scope.go:117] "RemoveContainer" containerID="f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.351873 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125"} err="failed to get container status 
\"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\": rpc error: code = NotFound desc = could not find container \"f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125\": container with ID starting with f281fd99507a57fb8d6910db58389411d79194b10b63ac3c2325442cf6ffa125 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.351888 4956 scope.go:117] "RemoveContainer" containerID="ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.352049 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360"} err="failed to get container status \"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\": rpc error: code = NotFound desc = could not find container \"ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360\": container with ID starting with ffc4dab6fe306d169093b7ac65717c1835e5d3b4a002c8ec2822889ecdb63360 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.352065 4956 scope.go:117] "RemoveContainer" containerID="05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.352268 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f"} err="failed to get container status \"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\": rpc error: code = NotFound desc = could not find container \"05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f\": container with ID starting with 05059544f05d99b60a1d0ac565f633c9ae3981b7f496a29020716ad660f31a7f not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.352293 4956 scope.go:117] "RemoveContainer" 
containerID="60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.352542 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3"} err="failed to get container status \"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3\": rpc error: code = NotFound desc = could not find container \"60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3\": container with ID starting with 60680fb6c4f4f4ff08aba18afe0837388c81a700b4090228f8ebb0edbc9771c3 not found: ID does not exist" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.426935 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-run-systemd\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.426979 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-cni-bin\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.426999 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjpnl\" (UniqueName: \"kubernetes.io/projected/e5b10d87-fe01-40c9-8819-b3af7968f698-kube-api-access-xjpnl\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427025 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/e5b10d87-fe01-40c9-8819-b3af7968f698-ovn-node-metrics-cert\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427049 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-slash\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427069 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-run-ovn-kubernetes\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427085 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5b10d87-fe01-40c9-8819-b3af7968f698-ovnkube-script-lib\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427102 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-run-systemd\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427178 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-slash\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427211 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-run-ovn-kubernetes\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427233 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-kubelet\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427250 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-run-netns\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427269 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-log-socket\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427279 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-kubelet\") pod 
\"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427306 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-run-netns\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427332 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-run-openvswitch\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427361 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-var-lib-openvswitch\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427390 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-log-socket\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427395 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-run-openvswitch\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427433 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-var-lib-openvswitch\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427456 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-cni-netd\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427491 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-cni-netd\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427527 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5b10d87-fe01-40c9-8819-b3af7968f698-env-overrides\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427573 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-etc-openvswitch\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc 
kubenswrapper[4956]: I0930 05:37:53.427599 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-etc-openvswitch\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427616 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427641 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-systemd-units\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427688 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-run-ovn\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427708 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-systemd-units\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427714 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-node-log\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427713 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427733 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5b10d87-fe01-40c9-8819-b3af7968f698-ovnkube-config\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427742 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-run-ovn\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427744 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-node-log\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427770 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/e5b10d87-fe01-40c9-8819-b3af7968f698-host-cni-bin\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427809 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5xxz\" (UniqueName: \"kubernetes.io/projected/29df1c73-1262-4143-b710-bc690edc2ab8-kube-api-access-p5xxz\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427821 4956 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29df1c73-1262-4143-b710-bc690edc2ab8-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.427830 4956 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29df1c73-1262-4143-b710-bc690edc2ab8-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.428166 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5b10d87-fe01-40c9-8819-b3af7968f698-env-overrides\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.428253 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5b10d87-fe01-40c9-8819-b3af7968f698-ovnkube-script-lib\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.428428 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/e5b10d87-fe01-40c9-8819-b3af7968f698-ovnkube-config\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.430440 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5b10d87-fe01-40c9-8819-b3af7968f698-ovn-node-metrics-cert\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.442309 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjpnl\" (UniqueName: \"kubernetes.io/projected/e5b10d87-fe01-40c9-8819-b3af7968f698-kube-api-access-xjpnl\") pod \"ovnkube-node-29mm8\" (UID: \"e5b10d87-fe01-40c9-8819-b3af7968f698\") " pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.502245 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:53 crc kubenswrapper[4956]: W0930 05:37:53.519564 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5b10d87_fe01_40c9_8819_b3af7968f698.slice/crio-5c5cf25f32f5675731d996dbc79c83f9ffbaf320091cff500d832b493ccfdc56 WatchSource:0}: Error finding container 5c5cf25f32f5675731d996dbc79c83f9ffbaf320091cff500d832b493ccfdc56: Status 404 returned error can't find the container with id 5c5cf25f32f5675731d996dbc79c83f9ffbaf320091cff500d832b493ccfdc56 Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.562149 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j8sw2"] Sep 30 05:37:53 crc kubenswrapper[4956]: I0930 05:37:53.567475 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j8sw2"] Sep 30 05:37:54 crc kubenswrapper[4956]: I0930 05:37:54.231109 4956 generic.go:334] "Generic (PLEG): container finished" podID="e5b10d87-fe01-40c9-8819-b3af7968f698" containerID="5d0bfbbd7cbd9e1aabec27dabaed42774ba9c4b76a9e21a2d93057a9efbf74d1" exitCode=0 Sep 30 05:37:54 crc kubenswrapper[4956]: I0930 05:37:54.231219 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" event={"ID":"e5b10d87-fe01-40c9-8819-b3af7968f698","Type":"ContainerDied","Data":"5d0bfbbd7cbd9e1aabec27dabaed42774ba9c4b76a9e21a2d93057a9efbf74d1"} Sep 30 05:37:54 crc kubenswrapper[4956]: I0930 05:37:54.231251 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" event={"ID":"e5b10d87-fe01-40c9-8819-b3af7968f698","Type":"ContainerStarted","Data":"5c5cf25f32f5675731d996dbc79c83f9ffbaf320091cff500d832b493ccfdc56"} Sep 30 05:37:54 crc kubenswrapper[4956]: I0930 05:37:54.352641 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="29df1c73-1262-4143-b710-bc690edc2ab8" path="/var/lib/kubelet/pods/29df1c73-1262-4143-b710-bc690edc2ab8/volumes" Sep 30 05:37:55 crc kubenswrapper[4956]: I0930 05:37:55.259363 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" event={"ID":"e5b10d87-fe01-40c9-8819-b3af7968f698","Type":"ContainerStarted","Data":"e41d318e94c6eb7c28fad82576f27a3b64ca52ba3729fb3e709845da15b25829"} Sep 30 05:37:55 crc kubenswrapper[4956]: I0930 05:37:55.259661 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" event={"ID":"e5b10d87-fe01-40c9-8819-b3af7968f698","Type":"ContainerStarted","Data":"31f0ef4af558c2387f0afd6a006621d55481718ee89b655dcf3bb0ad9c3d42cc"} Sep 30 05:37:55 crc kubenswrapper[4956]: I0930 05:37:55.259678 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" event={"ID":"e5b10d87-fe01-40c9-8819-b3af7968f698","Type":"ContainerStarted","Data":"c3d53023fb4ba7f2a253e70a0adc5feb8d7df897fdeac9edf87458f12ca4537a"} Sep 30 05:37:55 crc kubenswrapper[4956]: I0930 05:37:55.259692 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" event={"ID":"e5b10d87-fe01-40c9-8819-b3af7968f698","Type":"ContainerStarted","Data":"f00f8330dd8428fa2a106c92ad87ab02fca1f4b06b9d56789480cadb695e158c"} Sep 30 05:37:55 crc kubenswrapper[4956]: I0930 05:37:55.259704 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" event={"ID":"e5b10d87-fe01-40c9-8819-b3af7968f698","Type":"ContainerStarted","Data":"de0ae7858b577660e57e75589bc54f554293aa5991fee06504e729e59d5ace23"} Sep 30 05:37:55 crc kubenswrapper[4956]: I0930 05:37:55.259716 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" 
event={"ID":"e5b10d87-fe01-40c9-8819-b3af7968f698","Type":"ContainerStarted","Data":"851661db9b156ef76017ab31f9e19d2bc8f9707c8ffe18029da3fd35b5c5e6f8"} Sep 30 05:37:57 crc kubenswrapper[4956]: I0930 05:37:57.275508 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" event={"ID":"e5b10d87-fe01-40c9-8819-b3af7968f698","Type":"ContainerStarted","Data":"0f8e5e4493086317775e28d4de97314beb5d79a8f5007f89061c0106e7105d3e"} Sep 30 05:37:59 crc kubenswrapper[4956]: I0930 05:37:59.290160 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" event={"ID":"e5b10d87-fe01-40c9-8819-b3af7968f698","Type":"ContainerStarted","Data":"8edc98855efe2d25e3201b559d6a5670fbf7733b0513ecedf0a7f47838eeb5c1"} Sep 30 05:37:59 crc kubenswrapper[4956]: I0930 05:37:59.290361 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:59 crc kubenswrapper[4956]: I0930 05:37:59.290441 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:59 crc kubenswrapper[4956]: I0930 05:37:59.290453 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:59 crc kubenswrapper[4956]: I0930 05:37:59.319799 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" podStartSLOduration=6.319782061 podStartE2EDuration="6.319782061s" podCreationTimestamp="2025-09-30 05:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:37:59.317257319 +0000 UTC m=+549.644377854" watchObservedRunningTime="2025-09-30 05:37:59.319782061 +0000 UTC m=+549.646902586" Sep 30 05:37:59 crc kubenswrapper[4956]: I0930 05:37:59.322791 4956 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:37:59 crc kubenswrapper[4956]: I0930 05:37:59.326326 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:38:05 crc kubenswrapper[4956]: I0930 05:38:05.341006 4956 scope.go:117] "RemoveContainer" containerID="910f9a9e20921fea2d407aca2be189b0e75ce142086dfbbea368233848f74b83" Sep 30 05:38:05 crc kubenswrapper[4956]: E0930 05:38:05.342078 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-frfx9_openshift-multus(72ad9902-843c-4117-9ac1-c34d525c9d55)\"" pod="openshift-multus/multus-frfx9" podUID="72ad9902-843c-4117-9ac1-c34d525c9d55" Sep 30 05:38:17 crc kubenswrapper[4956]: I0930 05:38:17.340442 4956 scope.go:117] "RemoveContainer" containerID="910f9a9e20921fea2d407aca2be189b0e75ce142086dfbbea368233848f74b83" Sep 30 05:38:18 crc kubenswrapper[4956]: I0930 05:38:18.073829 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:38:18 crc kubenswrapper[4956]: I0930 05:38:18.074329 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:38:18 crc kubenswrapper[4956]: I0930 05:38:18.074405 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:38:18 crc kubenswrapper[4956]: I0930 05:38:18.075320 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf32bc96b878134a19be7c69abbea20183ca590cbab32abb08f319a6b44d2808"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 05:38:18 crc kubenswrapper[4956]: I0930 05:38:18.075444 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://bf32bc96b878134a19be7c69abbea20183ca590cbab32abb08f319a6b44d2808" gracePeriod=600 Sep 30 05:38:18 crc kubenswrapper[4956]: I0930 05:38:18.398873 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="bf32bc96b878134a19be7c69abbea20183ca590cbab32abb08f319a6b44d2808" exitCode=0 Sep 30 05:38:18 crc kubenswrapper[4956]: I0930 05:38:18.398986 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"bf32bc96b878134a19be7c69abbea20183ca590cbab32abb08f319a6b44d2808"} Sep 30 05:38:18 crc kubenswrapper[4956]: I0930 05:38:18.399291 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"cb7b5906427c53280d2cfba1bc232da8c4f2d2136d0c44bbfff91603d668f7d8"} Sep 30 05:38:18 crc kubenswrapper[4956]: I0930 05:38:18.399329 4956 scope.go:117] "RemoveContainer" 
containerID="f7ecb2580b031d845f054b28f4a97a97c58fe86efb667130fe849d07f1e2cafa" Sep 30 05:38:18 crc kubenswrapper[4956]: I0930 05:38:18.402374 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-frfx9_72ad9902-843c-4117-9ac1-c34d525c9d55/kube-multus/2.log" Sep 30 05:38:18 crc kubenswrapper[4956]: I0930 05:38:18.403325 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-frfx9_72ad9902-843c-4117-9ac1-c34d525c9d55/kube-multus/1.log" Sep 30 05:38:18 crc kubenswrapper[4956]: I0930 05:38:18.403417 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-frfx9" event={"ID":"72ad9902-843c-4117-9ac1-c34d525c9d55","Type":"ContainerStarted","Data":"bf7846dfb6616e85a119132435d29d596ccca3e39bdba9b03707eaf9acc68db7"} Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.279006 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k"] Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.280682 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.282574 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.294999 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k"] Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.462913 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50a2fa9e-1d17-41de-af81-c06f0afdf170-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k\" (UID: \"50a2fa9e-1d17-41de-af81-c06f0afdf170\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.463056 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg6ps\" (UniqueName: \"kubernetes.io/projected/50a2fa9e-1d17-41de-af81-c06f0afdf170-kube-api-access-vg6ps\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k\" (UID: \"50a2fa9e-1d17-41de-af81-c06f0afdf170\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.463189 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50a2fa9e-1d17-41de-af81-c06f0afdf170-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k\" (UID: \"50a2fa9e-1d17-41de-af81-c06f0afdf170\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:20 crc kubenswrapper[4956]: 
I0930 05:38:20.564454 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg6ps\" (UniqueName: \"kubernetes.io/projected/50a2fa9e-1d17-41de-af81-c06f0afdf170-kube-api-access-vg6ps\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k\" (UID: \"50a2fa9e-1d17-41de-af81-c06f0afdf170\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.564508 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50a2fa9e-1d17-41de-af81-c06f0afdf170-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k\" (UID: \"50a2fa9e-1d17-41de-af81-c06f0afdf170\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.564563 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50a2fa9e-1d17-41de-af81-c06f0afdf170-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k\" (UID: \"50a2fa9e-1d17-41de-af81-c06f0afdf170\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.564951 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50a2fa9e-1d17-41de-af81-c06f0afdf170-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k\" (UID: \"50a2fa9e-1d17-41de-af81-c06f0afdf170\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.565308 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/50a2fa9e-1d17-41de-af81-c06f0afdf170-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k\" (UID: \"50a2fa9e-1d17-41de-af81-c06f0afdf170\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.586420 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg6ps\" (UniqueName: \"kubernetes.io/projected/50a2fa9e-1d17-41de-af81-c06f0afdf170-kube-api-access-vg6ps\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k\" (UID: \"50a2fa9e-1d17-41de-af81-c06f0afdf170\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.596596 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:20 crc kubenswrapper[4956]: I0930 05:38:20.827528 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k"] Sep 30 05:38:20 crc kubenswrapper[4956]: W0930 05:38:20.832866 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50a2fa9e_1d17_41de_af81_c06f0afdf170.slice/crio-9324c4f904e4719ae73208d4a8de0212a19f1b7364976f89bf1dad2c0322b494 WatchSource:0}: Error finding container 9324c4f904e4719ae73208d4a8de0212a19f1b7364976f89bf1dad2c0322b494: Status 404 returned error can't find the container with id 9324c4f904e4719ae73208d4a8de0212a19f1b7364976f89bf1dad2c0322b494 Sep 30 05:38:21 crc kubenswrapper[4956]: I0930 05:38:21.426470 4956 generic.go:334] "Generic (PLEG): container finished" podID="50a2fa9e-1d17-41de-af81-c06f0afdf170" containerID="36de7c19329f7e1e7a3a3d12371935598c6168e867fced7f7f93c0b2af19e222" 
exitCode=0 Sep 30 05:38:21 crc kubenswrapper[4956]: I0930 05:38:21.426749 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" event={"ID":"50a2fa9e-1d17-41de-af81-c06f0afdf170","Type":"ContainerDied","Data":"36de7c19329f7e1e7a3a3d12371935598c6168e867fced7f7f93c0b2af19e222"} Sep 30 05:38:21 crc kubenswrapper[4956]: I0930 05:38:21.426783 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" event={"ID":"50a2fa9e-1d17-41de-af81-c06f0afdf170","Type":"ContainerStarted","Data":"9324c4f904e4719ae73208d4a8de0212a19f1b7364976f89bf1dad2c0322b494"} Sep 30 05:38:23 crc kubenswrapper[4956]: I0930 05:38:23.438880 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" event={"ID":"50a2fa9e-1d17-41de-af81-c06f0afdf170","Type":"ContainerStarted","Data":"8b24c83622dfc7132c508ce1aaee2d84cdb0b479b62e4cc2e478e8d9cf7cd4a3"} Sep 30 05:38:23 crc kubenswrapper[4956]: I0930 05:38:23.523927 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29mm8" Sep 30 05:38:24 crc kubenswrapper[4956]: I0930 05:38:24.449673 4956 generic.go:334] "Generic (PLEG): container finished" podID="50a2fa9e-1d17-41de-af81-c06f0afdf170" containerID="8b24c83622dfc7132c508ce1aaee2d84cdb0b479b62e4cc2e478e8d9cf7cd4a3" exitCode=0 Sep 30 05:38:24 crc kubenswrapper[4956]: I0930 05:38:24.449821 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" event={"ID":"50a2fa9e-1d17-41de-af81-c06f0afdf170","Type":"ContainerDied","Data":"8b24c83622dfc7132c508ce1aaee2d84cdb0b479b62e4cc2e478e8d9cf7cd4a3"} Sep 30 05:38:25 crc kubenswrapper[4956]: I0930 05:38:25.462906 4956 generic.go:334] "Generic (PLEG): 
container finished" podID="50a2fa9e-1d17-41de-af81-c06f0afdf170" containerID="7fdfaeb53d02dd9189810e3b0728c8a91b8f6bd1998a5be25634d72a5b0b9a06" exitCode=0 Sep 30 05:38:25 crc kubenswrapper[4956]: I0930 05:38:25.462969 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" event={"ID":"50a2fa9e-1d17-41de-af81-c06f0afdf170","Type":"ContainerDied","Data":"7fdfaeb53d02dd9189810e3b0728c8a91b8f6bd1998a5be25634d72a5b0b9a06"} Sep 30 05:38:26 crc kubenswrapper[4956]: I0930 05:38:26.678494 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:26 crc kubenswrapper[4956]: I0930 05:38:26.839523 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg6ps\" (UniqueName: \"kubernetes.io/projected/50a2fa9e-1d17-41de-af81-c06f0afdf170-kube-api-access-vg6ps\") pod \"50a2fa9e-1d17-41de-af81-c06f0afdf170\" (UID: \"50a2fa9e-1d17-41de-af81-c06f0afdf170\") " Sep 30 05:38:26 crc kubenswrapper[4956]: I0930 05:38:26.839813 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50a2fa9e-1d17-41de-af81-c06f0afdf170-util\") pod \"50a2fa9e-1d17-41de-af81-c06f0afdf170\" (UID: \"50a2fa9e-1d17-41de-af81-c06f0afdf170\") " Sep 30 05:38:26 crc kubenswrapper[4956]: I0930 05:38:26.839882 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50a2fa9e-1d17-41de-af81-c06f0afdf170-bundle\") pod \"50a2fa9e-1d17-41de-af81-c06f0afdf170\" (UID: \"50a2fa9e-1d17-41de-af81-c06f0afdf170\") " Sep 30 05:38:26 crc kubenswrapper[4956]: I0930 05:38:26.842518 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/50a2fa9e-1d17-41de-af81-c06f0afdf170-bundle" (OuterVolumeSpecName: "bundle") pod "50a2fa9e-1d17-41de-af81-c06f0afdf170" (UID: "50a2fa9e-1d17-41de-af81-c06f0afdf170"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:38:26 crc kubenswrapper[4956]: I0930 05:38:26.848368 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a2fa9e-1d17-41de-af81-c06f0afdf170-kube-api-access-vg6ps" (OuterVolumeSpecName: "kube-api-access-vg6ps") pod "50a2fa9e-1d17-41de-af81-c06f0afdf170" (UID: "50a2fa9e-1d17-41de-af81-c06f0afdf170"). InnerVolumeSpecName "kube-api-access-vg6ps". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:38:26 crc kubenswrapper[4956]: I0930 05:38:26.849999 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a2fa9e-1d17-41de-af81-c06f0afdf170-util" (OuterVolumeSpecName: "util") pod "50a2fa9e-1d17-41de-af81-c06f0afdf170" (UID: "50a2fa9e-1d17-41de-af81-c06f0afdf170"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:38:26 crc kubenswrapper[4956]: I0930 05:38:26.941885 4956 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50a2fa9e-1d17-41de-af81-c06f0afdf170-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:38:26 crc kubenswrapper[4956]: I0930 05:38:26.941939 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg6ps\" (UniqueName: \"kubernetes.io/projected/50a2fa9e-1d17-41de-af81-c06f0afdf170-kube-api-access-vg6ps\") on node \"crc\" DevicePath \"\"" Sep 30 05:38:26 crc kubenswrapper[4956]: I0930 05:38:26.941962 4956 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50a2fa9e-1d17-41de-af81-c06f0afdf170-util\") on node \"crc\" DevicePath \"\"" Sep 30 05:38:27 crc kubenswrapper[4956]: I0930 05:38:27.479093 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" event={"ID":"50a2fa9e-1d17-41de-af81-c06f0afdf170","Type":"ContainerDied","Data":"9324c4f904e4719ae73208d4a8de0212a19f1b7364976f89bf1dad2c0322b494"} Sep 30 05:38:27 crc kubenswrapper[4956]: I0930 05:38:27.479163 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k" Sep 30 05:38:27 crc kubenswrapper[4956]: I0930 05:38:27.479186 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9324c4f904e4719ae73208d4a8de0212a19f1b7364976f89bf1dad2c0322b494" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.235291 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-vl7fv"] Sep 30 05:38:37 crc kubenswrapper[4956]: E0930 05:38:37.235876 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a2fa9e-1d17-41de-af81-c06f0afdf170" containerName="util" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.235888 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a2fa9e-1d17-41de-af81-c06f0afdf170" containerName="util" Sep 30 05:38:37 crc kubenswrapper[4956]: E0930 05:38:37.235904 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a2fa9e-1d17-41de-af81-c06f0afdf170" containerName="extract" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.235910 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a2fa9e-1d17-41de-af81-c06f0afdf170" containerName="extract" Sep 30 05:38:37 crc kubenswrapper[4956]: E0930 05:38:37.235918 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a2fa9e-1d17-41de-af81-c06f0afdf170" containerName="pull" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.235925 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a2fa9e-1d17-41de-af81-c06f0afdf170" containerName="pull" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.236014 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a2fa9e-1d17-41de-af81-c06f0afdf170" containerName="extract" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.236378 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vl7fv" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.238153 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-zpvl6" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.238829 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.238945 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.248661 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-vl7fv"] Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.294766 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b"] Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.296785 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.299492 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-2fkrv" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.299654 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.314488 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5"] Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.316071 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.343920 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b"] Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.346975 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5"] Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.366515 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0dd88442-b29e-47e9-b221-57ac09bbc7cb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5\" (UID: \"0dd88442-b29e-47e9-b221-57ac09bbc7cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.366584 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0dd88442-b29e-47e9-b221-57ac09bbc7cb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5\" (UID: \"0dd88442-b29e-47e9-b221-57ac09bbc7cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.366605 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/868bdc73-4a3b-49ec-9676-0d98a950e1ed-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b\" (UID: \"868bdc73-4a3b-49ec-9676-0d98a950e1ed\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 
05:38:37.366624 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/868bdc73-4a3b-49ec-9676-0d98a950e1ed-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b\" (UID: \"868bdc73-4a3b-49ec-9676-0d98a950e1ed\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.366672 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w79ws\" (UniqueName: \"kubernetes.io/projected/45c3115a-12d7-4cd7-83a8-f9a720e63ce6-kube-api-access-w79ws\") pod \"obo-prometheus-operator-7c8cf85677-vl7fv\" (UID: \"45c3115a-12d7-4cd7-83a8-f9a720e63ce6\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vl7fv" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.461272 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-pv5cs"] Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.461943 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.466303 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-95627" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.466523 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.467494 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfspq\" (UniqueName: \"kubernetes.io/projected/f9203661-7a5b-45cd-9057-78b70739a89b-kube-api-access-lfspq\") pod \"observability-operator-cc5f78dfc-pv5cs\" (UID: \"f9203661-7a5b-45cd-9057-78b70739a89b\") " pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.467588 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w79ws\" (UniqueName: \"kubernetes.io/projected/45c3115a-12d7-4cd7-83a8-f9a720e63ce6-kube-api-access-w79ws\") pod \"obo-prometheus-operator-7c8cf85677-vl7fv\" (UID: \"45c3115a-12d7-4cd7-83a8-f9a720e63ce6\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vl7fv" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.467641 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f9203661-7a5b-45cd-9057-78b70739a89b-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-pv5cs\" (UID: \"f9203661-7a5b-45cd-9057-78b70739a89b\") " pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.467665 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/0dd88442-b29e-47e9-b221-57ac09bbc7cb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5\" (UID: \"0dd88442-b29e-47e9-b221-57ac09bbc7cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.467716 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0dd88442-b29e-47e9-b221-57ac09bbc7cb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5\" (UID: \"0dd88442-b29e-47e9-b221-57ac09bbc7cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.467736 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/868bdc73-4a3b-49ec-9676-0d98a950e1ed-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b\" (UID: \"868bdc73-4a3b-49ec-9676-0d98a950e1ed\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.467759 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/868bdc73-4a3b-49ec-9676-0d98a950e1ed-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b\" (UID: \"868bdc73-4a3b-49ec-9676-0d98a950e1ed\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.474950 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/868bdc73-4a3b-49ec-9676-0d98a950e1ed-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b\" (UID: \"868bdc73-4a3b-49ec-9676-0d98a950e1ed\") 
" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.476489 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/868bdc73-4a3b-49ec-9676-0d98a950e1ed-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b\" (UID: \"868bdc73-4a3b-49ec-9676-0d98a950e1ed\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.483519 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0dd88442-b29e-47e9-b221-57ac09bbc7cb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5\" (UID: \"0dd88442-b29e-47e9-b221-57ac09bbc7cb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.483974 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-pv5cs"] Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.484379 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w79ws\" (UniqueName: \"kubernetes.io/projected/45c3115a-12d7-4cd7-83a8-f9a720e63ce6-kube-api-access-w79ws\") pod \"obo-prometheus-operator-7c8cf85677-vl7fv\" (UID: \"45c3115a-12d7-4cd7-83a8-f9a720e63ce6\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vl7fv" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.489303 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0dd88442-b29e-47e9-b221-57ac09bbc7cb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5\" (UID: \"0dd88442-b29e-47e9-b221-57ac09bbc7cb\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.567746 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vl7fv" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.568709 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfspq\" (UniqueName: \"kubernetes.io/projected/f9203661-7a5b-45cd-9057-78b70739a89b-kube-api-access-lfspq\") pod \"observability-operator-cc5f78dfc-pv5cs\" (UID: \"f9203661-7a5b-45cd-9057-78b70739a89b\") " pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.568791 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f9203661-7a5b-45cd-9057-78b70739a89b-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-pv5cs\" (UID: \"f9203661-7a5b-45cd-9057-78b70739a89b\") " pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.574874 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/f9203661-7a5b-45cd-9057-78b70739a89b-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-pv5cs\" (UID: \"f9203661-7a5b-45cd-9057-78b70739a89b\") " pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.584616 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfspq\" (UniqueName: \"kubernetes.io/projected/f9203661-7a5b-45cd-9057-78b70739a89b-kube-api-access-lfspq\") pod \"observability-operator-cc5f78dfc-pv5cs\" (UID: \"f9203661-7a5b-45cd-9057-78b70739a89b\") " 
pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.658933 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.659126 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.664138 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-j6v4z"] Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.665272 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.670384 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6qdc9" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.670770 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d32f519a-014e-43e2-b715-78e9fd9197c3-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-j6v4z\" (UID: \"d32f519a-014e-43e2-b715-78e9fd9197c3\") " pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.671021 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdgn4\" (UniqueName: \"kubernetes.io/projected/d32f519a-014e-43e2-b715-78e9fd9197c3-kube-api-access-qdgn4\") pod \"perses-operator-54bc95c9fb-j6v4z\" (UID: \"d32f519a-014e-43e2-b715-78e9fd9197c3\") " pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 
05:38:37.682201 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-j6v4z"] Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.771691 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d32f519a-014e-43e2-b715-78e9fd9197c3-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-j6v4z\" (UID: \"d32f519a-014e-43e2-b715-78e9fd9197c3\") " pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.771734 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgn4\" (UniqueName: \"kubernetes.io/projected/d32f519a-014e-43e2-b715-78e9fd9197c3-kube-api-access-qdgn4\") pod \"perses-operator-54bc95c9fb-j6v4z\" (UID: \"d32f519a-014e-43e2-b715-78e9fd9197c3\") " pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.772717 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d32f519a-014e-43e2-b715-78e9fd9197c3-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-j6v4z\" (UID: \"d32f519a-014e-43e2-b715-78e9fd9197c3\") " pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.795779 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdgn4\" (UniqueName: \"kubernetes.io/projected/d32f519a-014e-43e2-b715-78e9fd9197c3-kube-api-access-qdgn4\") pod \"perses-operator-54bc95c9fb-j6v4z\" (UID: \"d32f519a-014e-43e2-b715-78e9fd9197c3\") " pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.803482 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-vl7fv"] Sep 30 05:38:37 
crc kubenswrapper[4956]: I0930 05:38:37.825140 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" Sep 30 05:38:37 crc kubenswrapper[4956]: I0930 05:38:37.930805 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b"] Sep 30 05:38:38 crc kubenswrapper[4956]: I0930 05:38:38.015184 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" Sep 30 05:38:38 crc kubenswrapper[4956]: I0930 05:38:38.101238 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-pv5cs"] Sep 30 05:38:38 crc kubenswrapper[4956]: W0930 05:38:38.110257 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9203661_7a5b_45cd_9057_78b70739a89b.slice/crio-ed4a566b1ab0bc329c5c2cbc15b1e69fc2fbc98461877e49f3b96865c7d93867 WatchSource:0}: Error finding container ed4a566b1ab0bc329c5c2cbc15b1e69fc2fbc98461877e49f3b96865c7d93867: Status 404 returned error can't find the container with id ed4a566b1ab0bc329c5c2cbc15b1e69fc2fbc98461877e49f3b96865c7d93867 Sep 30 05:38:38 crc kubenswrapper[4956]: I0930 05:38:38.155416 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5"] Sep 30 05:38:38 crc kubenswrapper[4956]: W0930 05:38:38.165204 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dd88442_b29e_47e9_b221_57ac09bbc7cb.slice/crio-af11e174e4248f1f8062ae3089cf5c4a5d3b6189ff0878820de820d6172b8d9a WatchSource:0}: Error finding container af11e174e4248f1f8062ae3089cf5c4a5d3b6189ff0878820de820d6172b8d9a: Status 404 returned error can't find the container with id 
af11e174e4248f1f8062ae3089cf5c4a5d3b6189ff0878820de820d6172b8d9a Sep 30 05:38:38 crc kubenswrapper[4956]: I0930 05:38:38.226563 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-j6v4z"] Sep 30 05:38:38 crc kubenswrapper[4956]: W0930 05:38:38.233988 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd32f519a_014e_43e2_b715_78e9fd9197c3.slice/crio-e4e87ba8214bb338359151a9cf2dd5da63c1be9759059e7b994ef77c0d40411d WatchSource:0}: Error finding container e4e87ba8214bb338359151a9cf2dd5da63c1be9759059e7b994ef77c0d40411d: Status 404 returned error can't find the container with id e4e87ba8214bb338359151a9cf2dd5da63c1be9759059e7b994ef77c0d40411d Sep 30 05:38:38 crc kubenswrapper[4956]: I0930 05:38:38.538061 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" event={"ID":"d32f519a-014e-43e2-b715-78e9fd9197c3","Type":"ContainerStarted","Data":"e4e87ba8214bb338359151a9cf2dd5da63c1be9759059e7b994ef77c0d40411d"} Sep 30 05:38:38 crc kubenswrapper[4956]: I0930 05:38:38.539519 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vl7fv" event={"ID":"45c3115a-12d7-4cd7-83a8-f9a720e63ce6","Type":"ContainerStarted","Data":"900220f5160411d95bd16cc92679ce1e155df5a1ae3babcbcf12632d9b8a897b"} Sep 30 05:38:38 crc kubenswrapper[4956]: I0930 05:38:38.541040 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5" event={"ID":"0dd88442-b29e-47e9-b221-57ac09bbc7cb","Type":"ContainerStarted","Data":"af11e174e4248f1f8062ae3089cf5c4a5d3b6189ff0878820de820d6172b8d9a"} Sep 30 05:38:38 crc kubenswrapper[4956]: I0930 05:38:38.542528 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b" event={"ID":"868bdc73-4a3b-49ec-9676-0d98a950e1ed","Type":"ContainerStarted","Data":"562b621bc0cf591bddd4719319ac403dbd5649666166b4fce00ae53419a079ee"} Sep 30 05:38:38 crc kubenswrapper[4956]: I0930 05:38:38.543739 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" event={"ID":"f9203661-7a5b-45cd-9057-78b70739a89b","Type":"ContainerStarted","Data":"ed4a566b1ab0bc329c5c2cbc15b1e69fc2fbc98461877e49f3b96865c7d93867"} Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.532360 4956 scope.go:117] "RemoveContainer" containerID="13a15e9b9cc9074a0dd59fbc43bb447cfe077462e1a7815e4f9b64e6864d6119" Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.625858 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" event={"ID":"d32f519a-014e-43e2-b715-78e9fd9197c3","Type":"ContainerStarted","Data":"1966cbeb590e83f05a886a1ac7b530cbbda1ae6643aa5811bdfa6fff8f8a76ec"} Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.625961 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.627668 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vl7fv" event={"ID":"45c3115a-12d7-4cd7-83a8-f9a720e63ce6","Type":"ContainerStarted","Data":"a186dbec6cac5cf34d4e26ecd1e6feb616e94146225a2a9128d5cc6610e62c50"} Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.629425 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-frfx9_72ad9902-843c-4117-9ac1-c34d525c9d55/kube-multus/2.log" Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.634728 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5" event={"ID":"0dd88442-b29e-47e9-b221-57ac09bbc7cb","Type":"ContainerStarted","Data":"ef21e51b6fc1b5456eeecca9dc05f871fac58431154e0bb84af3eedf911c3d06"} Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.636421 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b" event={"ID":"868bdc73-4a3b-49ec-9676-0d98a950e1ed","Type":"ContainerStarted","Data":"901cf37cf256a2f36052f0581b49ffad3a174fbc4ac9fc29f3a356c61cdc3aa1"} Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.638147 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" event={"ID":"f9203661-7a5b-45cd-9057-78b70739a89b","Type":"ContainerStarted","Data":"a8b1ecdfd1442006df44fa8c0a563c38f7626e78c75586da7e744ccbce5fe563"} Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.638355 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.660197 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" podStartSLOduration=2.105477517 podStartE2EDuration="13.660181752s" podCreationTimestamp="2025-09-30 05:38:37 +0000 UTC" firstStartedPulling="2025-09-30 05:38:38.238103431 +0000 UTC m=+588.565223946" lastFinishedPulling="2025-09-30 05:38:49.792807666 +0000 UTC m=+600.119928181" observedRunningTime="2025-09-30 05:38:50.658230449 +0000 UTC m=+600.985350974" watchObservedRunningTime="2025-09-30 05:38:50.660181752 +0000 UTC m=+600.987302277" Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.676367 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" Sep 30 05:38:50 crc 
kubenswrapper[4956]: I0930 05:38:50.685979 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vl7fv" podStartSLOduration=1.7321783819999998 podStartE2EDuration="13.685964039s" podCreationTimestamp="2025-09-30 05:38:37 +0000 UTC" firstStartedPulling="2025-09-30 05:38:37.813029575 +0000 UTC m=+588.140150100" lastFinishedPulling="2025-09-30 05:38:49.766815232 +0000 UTC m=+600.093935757" observedRunningTime="2025-09-30 05:38:50.683664095 +0000 UTC m=+601.010784620" watchObservedRunningTime="2025-09-30 05:38:50.685964039 +0000 UTC m=+601.013084574" Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.716077 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5" podStartSLOduration=2.122610776 podStartE2EDuration="13.716056315s" podCreationTimestamp="2025-09-30 05:38:37 +0000 UTC" firstStartedPulling="2025-09-30 05:38:38.172909718 +0000 UTC m=+588.500030243" lastFinishedPulling="2025-09-30 05:38:49.766355257 +0000 UTC m=+600.093475782" observedRunningTime="2025-09-30 05:38:50.714376152 +0000 UTC m=+601.041496697" watchObservedRunningTime="2025-09-30 05:38:50.716056315 +0000 UTC m=+601.043176841" Sep 30 05:38:50 crc kubenswrapper[4956]: I0930 05:38:50.743187 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b" podStartSLOduration=1.901491279 podStartE2EDuration="13.743173246s" podCreationTimestamp="2025-09-30 05:38:37 +0000 UTC" firstStartedPulling="2025-09-30 05:38:37.955000683 +0000 UTC m=+588.282121208" lastFinishedPulling="2025-09-30 05:38:49.79668265 +0000 UTC m=+600.123803175" observedRunningTime="2025-09-30 05:38:50.739982594 +0000 UTC m=+601.067103139" watchObservedRunningTime="2025-09-30 05:38:50.743173246 +0000 UTC m=+601.070293771" Sep 30 05:38:50 crc kubenswrapper[4956]: 
I0930 05:38:50.769324 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-pv5cs" podStartSLOduration=2.118609408 podStartE2EDuration="13.769303855s" podCreationTimestamp="2025-09-30 05:38:37 +0000 UTC" firstStartedPulling="2025-09-30 05:38:38.11564307 +0000 UTC m=+588.442763595" lastFinishedPulling="2025-09-30 05:38:49.766337517 +0000 UTC m=+600.093458042" observedRunningTime="2025-09-30 05:38:50.768103117 +0000 UTC m=+601.095223642" watchObservedRunningTime="2025-09-30 05:38:50.769303855 +0000 UTC m=+601.096424380" Sep 30 05:38:58 crc kubenswrapper[4956]: I0930 05:38:58.018932 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-j6v4z" Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.425739 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz"] Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.428127 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.436581 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.442628 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz"] Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.555389 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dd636e2-36d8-47b0-a01c-26852a43d3b3-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz\" (UID: \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.555443 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gd5f\" (UniqueName: \"kubernetes.io/projected/7dd636e2-36d8-47b0-a01c-26852a43d3b3-kube-api-access-2gd5f\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz\" (UID: \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.555470 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dd636e2-36d8-47b0-a01c-26852a43d3b3-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz\" (UID: \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:15 crc kubenswrapper[4956]: 
I0930 05:39:15.656837 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dd636e2-36d8-47b0-a01c-26852a43d3b3-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz\" (UID: \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.657092 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gd5f\" (UniqueName: \"kubernetes.io/projected/7dd636e2-36d8-47b0-a01c-26852a43d3b3-kube-api-access-2gd5f\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz\" (UID: \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.657191 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dd636e2-36d8-47b0-a01c-26852a43d3b3-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz\" (UID: \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.657672 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dd636e2-36d8-47b0-a01c-26852a43d3b3-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz\" (UID: \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.657833 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7dd636e2-36d8-47b0-a01c-26852a43d3b3-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz\" (UID: \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.674254 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gd5f\" (UniqueName: \"kubernetes.io/projected/7dd636e2-36d8-47b0-a01c-26852a43d3b3-kube-api-access-2gd5f\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz\" (UID: \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:15 crc kubenswrapper[4956]: I0930 05:39:15.744669 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:16 crc kubenswrapper[4956]: I0930 05:39:16.137137 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz"] Sep 30 05:39:16 crc kubenswrapper[4956]: I0930 05:39:16.785952 4956 generic.go:334] "Generic (PLEG): container finished" podID="7dd636e2-36d8-47b0-a01c-26852a43d3b3" containerID="4b9028f2099c493dcc8db3995557e4f57ccc0382405597be73fa5679830704e3" exitCode=0 Sep 30 05:39:16 crc kubenswrapper[4956]: I0930 05:39:16.786020 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" event={"ID":"7dd636e2-36d8-47b0-a01c-26852a43d3b3","Type":"ContainerDied","Data":"4b9028f2099c493dcc8db3995557e4f57ccc0382405597be73fa5679830704e3"} Sep 30 05:39:16 crc kubenswrapper[4956]: I0930 05:39:16.786214 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" event={"ID":"7dd636e2-36d8-47b0-a01c-26852a43d3b3","Type":"ContainerStarted","Data":"db178393adcb672216a73261188e7f8291875b553fd37a73520cb54bc83adb3b"} Sep 30 05:39:18 crc kubenswrapper[4956]: I0930 05:39:18.803616 4956 generic.go:334] "Generic (PLEG): container finished" podID="7dd636e2-36d8-47b0-a01c-26852a43d3b3" containerID="0213798069b87f49da7362a79384b85d0eba87549ccc682b681681c06c3f6e70" exitCode=0 Sep 30 05:39:18 crc kubenswrapper[4956]: I0930 05:39:18.803723 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" event={"ID":"7dd636e2-36d8-47b0-a01c-26852a43d3b3","Type":"ContainerDied","Data":"0213798069b87f49da7362a79384b85d0eba87549ccc682b681681c06c3f6e70"} Sep 30 05:39:19 crc kubenswrapper[4956]: I0930 05:39:19.812655 4956 generic.go:334] "Generic (PLEG): container finished" podID="7dd636e2-36d8-47b0-a01c-26852a43d3b3" containerID="7fd7622bb405742aaed563cd6de11748d2bafc758b648a8ca41a0664c64f9c6e" exitCode=0 Sep 30 05:39:19 crc kubenswrapper[4956]: I0930 05:39:19.812724 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" event={"ID":"7dd636e2-36d8-47b0-a01c-26852a43d3b3","Type":"ContainerDied","Data":"7fd7622bb405742aaed563cd6de11748d2bafc758b648a8ca41a0664c64f9c6e"} Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.029019 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.126665 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dd636e2-36d8-47b0-a01c-26852a43d3b3-bundle\") pod \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\" (UID: \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\") " Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.126788 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dd636e2-36d8-47b0-a01c-26852a43d3b3-util\") pod \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\" (UID: \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\") " Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.126858 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gd5f\" (UniqueName: \"kubernetes.io/projected/7dd636e2-36d8-47b0-a01c-26852a43d3b3-kube-api-access-2gd5f\") pod \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\" (UID: \"7dd636e2-36d8-47b0-a01c-26852a43d3b3\") " Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.127371 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd636e2-36d8-47b0-a01c-26852a43d3b3-bundle" (OuterVolumeSpecName: "bundle") pod "7dd636e2-36d8-47b0-a01c-26852a43d3b3" (UID: "7dd636e2-36d8-47b0-a01c-26852a43d3b3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.132039 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd636e2-36d8-47b0-a01c-26852a43d3b3-kube-api-access-2gd5f" (OuterVolumeSpecName: "kube-api-access-2gd5f") pod "7dd636e2-36d8-47b0-a01c-26852a43d3b3" (UID: "7dd636e2-36d8-47b0-a01c-26852a43d3b3"). InnerVolumeSpecName "kube-api-access-2gd5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.140411 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd636e2-36d8-47b0-a01c-26852a43d3b3-util" (OuterVolumeSpecName: "util") pod "7dd636e2-36d8-47b0-a01c-26852a43d3b3" (UID: "7dd636e2-36d8-47b0-a01c-26852a43d3b3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.228170 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gd5f\" (UniqueName: \"kubernetes.io/projected/7dd636e2-36d8-47b0-a01c-26852a43d3b3-kube-api-access-2gd5f\") on node \"crc\" DevicePath \"\"" Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.228204 4956 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dd636e2-36d8-47b0-a01c-26852a43d3b3-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.228213 4956 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dd636e2-36d8-47b0-a01c-26852a43d3b3-util\") on node \"crc\" DevicePath \"\"" Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.829652 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" event={"ID":"7dd636e2-36d8-47b0-a01c-26852a43d3b3","Type":"ContainerDied","Data":"db178393adcb672216a73261188e7f8291875b553fd37a73520cb54bc83adb3b"} Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.829704 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db178393adcb672216a73261188e7f8291875b553fd37a73520cb54bc83adb3b" Sep 30 05:39:21 crc kubenswrapper[4956]: I0930 05:39:21.829802 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz" Sep 30 05:39:24 crc kubenswrapper[4956]: I0930 05:39:24.892939 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-txhvp"] Sep 30 05:39:24 crc kubenswrapper[4956]: E0930 05:39:24.893421 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd636e2-36d8-47b0-a01c-26852a43d3b3" containerName="extract" Sep 30 05:39:24 crc kubenswrapper[4956]: I0930 05:39:24.893435 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd636e2-36d8-47b0-a01c-26852a43d3b3" containerName="extract" Sep 30 05:39:24 crc kubenswrapper[4956]: E0930 05:39:24.893452 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd636e2-36d8-47b0-a01c-26852a43d3b3" containerName="pull" Sep 30 05:39:24 crc kubenswrapper[4956]: I0930 05:39:24.893459 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd636e2-36d8-47b0-a01c-26852a43d3b3" containerName="pull" Sep 30 05:39:24 crc kubenswrapper[4956]: E0930 05:39:24.893477 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd636e2-36d8-47b0-a01c-26852a43d3b3" containerName="util" Sep 30 05:39:24 crc kubenswrapper[4956]: I0930 05:39:24.893483 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd636e2-36d8-47b0-a01c-26852a43d3b3" containerName="util" Sep 30 05:39:24 crc kubenswrapper[4956]: I0930 05:39:24.893574 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd636e2-36d8-47b0-a01c-26852a43d3b3" containerName="extract" Sep 30 05:39:24 crc kubenswrapper[4956]: I0930 05:39:24.893936 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-txhvp" Sep 30 05:39:24 crc kubenswrapper[4956]: I0930 05:39:24.895446 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 30 05:39:24 crc kubenswrapper[4956]: I0930 05:39:24.896645 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-nddv9" Sep 30 05:39:24 crc kubenswrapper[4956]: I0930 05:39:24.898295 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 30 05:39:24 crc kubenswrapper[4956]: I0930 05:39:24.909635 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-txhvp"] Sep 30 05:39:24 crc kubenswrapper[4956]: I0930 05:39:24.969671 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dghv\" (UniqueName: \"kubernetes.io/projected/34203d19-cae9-4ef7-863e-03f524e1a662-kube-api-access-6dghv\") pod \"nmstate-operator-5d6f6cfd66-txhvp\" (UID: \"34203d19-cae9-4ef7-863e-03f524e1a662\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-txhvp" Sep 30 05:39:25 crc kubenswrapper[4956]: I0930 05:39:25.091220 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dghv\" (UniqueName: \"kubernetes.io/projected/34203d19-cae9-4ef7-863e-03f524e1a662-kube-api-access-6dghv\") pod \"nmstate-operator-5d6f6cfd66-txhvp\" (UID: \"34203d19-cae9-4ef7-863e-03f524e1a662\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-txhvp" Sep 30 05:39:25 crc kubenswrapper[4956]: I0930 05:39:25.107379 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dghv\" (UniqueName: \"kubernetes.io/projected/34203d19-cae9-4ef7-863e-03f524e1a662-kube-api-access-6dghv\") pod \"nmstate-operator-5d6f6cfd66-txhvp\" (UID: 
\"34203d19-cae9-4ef7-863e-03f524e1a662\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-txhvp" Sep 30 05:39:25 crc kubenswrapper[4956]: I0930 05:39:25.209282 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-txhvp" Sep 30 05:39:25 crc kubenswrapper[4956]: I0930 05:39:25.572620 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-txhvp"] Sep 30 05:39:25 crc kubenswrapper[4956]: W0930 05:39:25.579699 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34203d19_cae9_4ef7_863e_03f524e1a662.slice/crio-f0208d9f222957f57ad4da97019cf545f1d5b60ae537eaa27144779a09336fe0 WatchSource:0}: Error finding container f0208d9f222957f57ad4da97019cf545f1d5b60ae537eaa27144779a09336fe0: Status 404 returned error can't find the container with id f0208d9f222957f57ad4da97019cf545f1d5b60ae537eaa27144779a09336fe0 Sep 30 05:39:25 crc kubenswrapper[4956]: I0930 05:39:25.851584 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-txhvp" event={"ID":"34203d19-cae9-4ef7-863e-03f524e1a662","Type":"ContainerStarted","Data":"f0208d9f222957f57ad4da97019cf545f1d5b60ae537eaa27144779a09336fe0"} Sep 30 05:39:28 crc kubenswrapper[4956]: I0930 05:39:28.868908 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-txhvp" event={"ID":"34203d19-cae9-4ef7-863e-03f524e1a662","Type":"ContainerStarted","Data":"8dd5fc15172acbbcd5ea61e04c2b75d8e12c0200807f2a883dd3edd49c62ae47"} Sep 30 05:39:28 crc kubenswrapper[4956]: I0930 05:39:28.890618 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-txhvp" podStartSLOduration=2.717380189 podStartE2EDuration="4.890603146s" podCreationTimestamp="2025-09-30 05:39:24 +0000 UTC" 
firstStartedPulling="2025-09-30 05:39:25.581451991 +0000 UTC m=+635.908572516" lastFinishedPulling="2025-09-30 05:39:27.754674948 +0000 UTC m=+638.081795473" observedRunningTime="2025-09-30 05:39:28.884241021 +0000 UTC m=+639.211361546" watchObservedRunningTime="2025-09-30 05:39:28.890603146 +0000 UTC m=+639.217723671" Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.837397 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-cfj2d"] Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.839316 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-cfj2d" Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.841105 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-n7q4c" Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.846453 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-cfj2d"] Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.862234 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-dxkwj"] Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.863196 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.865091 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt"] Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.865739 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.880360 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.912290 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt"] Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.952877 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1f61456f-3dcb-4831-9760-c06143ec9b14-dbus-socket\") pod \"nmstate-handler-dxkwj\" (UID: \"1f61456f-3dcb-4831-9760-c06143ec9b14\") " pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.952951 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8afb5dc0-a1aa-4c21-95b1-62c64b452ff1-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-vsdnt\" (UID: \"8afb5dc0-a1aa-4c21-95b1-62c64b452ff1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.952982 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwm2z\" (UniqueName: \"kubernetes.io/projected/8afb5dc0-a1aa-4c21-95b1-62c64b452ff1-kube-api-access-lwm2z\") pod \"nmstate-webhook-6d689559c5-vsdnt\" (UID: \"8afb5dc0-a1aa-4c21-95b1-62c64b452ff1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.953016 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1f61456f-3dcb-4831-9760-c06143ec9b14-nmstate-lock\") pod \"nmstate-handler-dxkwj\" (UID: 
\"1f61456f-3dcb-4831-9760-c06143ec9b14\") " pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.953042 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6v5b\" (UniqueName: \"kubernetes.io/projected/5c73bb61-b711-4efb-8ba7-de118a9b30e7-kube-api-access-w6v5b\") pod \"nmstate-metrics-58fcddf996-cfj2d\" (UID: \"5c73bb61-b711-4efb-8ba7-de118a9b30e7\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-cfj2d" Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.953060 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1f61456f-3dcb-4831-9760-c06143ec9b14-ovs-socket\") pod \"nmstate-handler-dxkwj\" (UID: \"1f61456f-3dcb-4831-9760-c06143ec9b14\") " pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:29 crc kubenswrapper[4956]: I0930 05:39:29.953100 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8g5f\" (UniqueName: \"kubernetes.io/projected/1f61456f-3dcb-4831-9760-c06143ec9b14-kube-api-access-q8g5f\") pod \"nmstate-handler-dxkwj\" (UID: \"1f61456f-3dcb-4831-9760-c06143ec9b14\") " pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.053938 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8g5f\" (UniqueName: \"kubernetes.io/projected/1f61456f-3dcb-4831-9760-c06143ec9b14-kube-api-access-q8g5f\") pod \"nmstate-handler-dxkwj\" (UID: \"1f61456f-3dcb-4831-9760-c06143ec9b14\") " pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.054035 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1f61456f-3dcb-4831-9760-c06143ec9b14-dbus-socket\") pod 
\"nmstate-handler-dxkwj\" (UID: \"1f61456f-3dcb-4831-9760-c06143ec9b14\") " pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.054073 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8afb5dc0-a1aa-4c21-95b1-62c64b452ff1-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-vsdnt\" (UID: \"8afb5dc0-a1aa-4c21-95b1-62c64b452ff1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.054166 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwm2z\" (UniqueName: \"kubernetes.io/projected/8afb5dc0-a1aa-4c21-95b1-62c64b452ff1-kube-api-access-lwm2z\") pod \"nmstate-webhook-6d689559c5-vsdnt\" (UID: \"8afb5dc0-a1aa-4c21-95b1-62c64b452ff1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.054199 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1f61456f-3dcb-4831-9760-c06143ec9b14-nmstate-lock\") pod \"nmstate-handler-dxkwj\" (UID: \"1f61456f-3dcb-4831-9760-c06143ec9b14\") " pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:30 crc kubenswrapper[4956]: E0930 05:39:30.054230 4956 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Sep 30 05:39:30 crc kubenswrapper[4956]: E0930 05:39:30.054290 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8afb5dc0-a1aa-4c21-95b1-62c64b452ff1-tls-key-pair podName:8afb5dc0-a1aa-4c21-95b1-62c64b452ff1 nodeName:}" failed. No retries permitted until 2025-09-30 05:39:30.554275024 +0000 UTC m=+640.881395549 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/8afb5dc0-a1aa-4c21-95b1-62c64b452ff1-tls-key-pair") pod "nmstate-webhook-6d689559c5-vsdnt" (UID: "8afb5dc0-a1aa-4c21-95b1-62c64b452ff1") : secret "openshift-nmstate-webhook" not found Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.054327 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1f61456f-3dcb-4831-9760-c06143ec9b14-nmstate-lock\") pod \"nmstate-handler-dxkwj\" (UID: \"1f61456f-3dcb-4831-9760-c06143ec9b14\") " pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.054373 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1f61456f-3dcb-4831-9760-c06143ec9b14-dbus-socket\") pod \"nmstate-handler-dxkwj\" (UID: \"1f61456f-3dcb-4831-9760-c06143ec9b14\") " pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.054235 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6v5b\" (UniqueName: \"kubernetes.io/projected/5c73bb61-b711-4efb-8ba7-de118a9b30e7-kube-api-access-w6v5b\") pod \"nmstate-metrics-58fcddf996-cfj2d\" (UID: \"5c73bb61-b711-4efb-8ba7-de118a9b30e7\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-cfj2d" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.054459 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1f61456f-3dcb-4831-9760-c06143ec9b14-ovs-socket\") pod \"nmstate-handler-dxkwj\" (UID: \"1f61456f-3dcb-4831-9760-c06143ec9b14\") " pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.054553 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/1f61456f-3dcb-4831-9760-c06143ec9b14-ovs-socket\") pod \"nmstate-handler-dxkwj\" (UID: \"1f61456f-3dcb-4831-9760-c06143ec9b14\") " pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.056964 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v"] Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.057844 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.059402 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-7hhj4" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.059767 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.059773 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.093362 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwm2z\" (UniqueName: \"kubernetes.io/projected/8afb5dc0-a1aa-4c21-95b1-62c64b452ff1-kube-api-access-lwm2z\") pod \"nmstate-webhook-6d689559c5-vsdnt\" (UID: \"8afb5dc0-a1aa-4c21-95b1-62c64b452ff1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.096602 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6v5b\" (UniqueName: \"kubernetes.io/projected/5c73bb61-b711-4efb-8ba7-de118a9b30e7-kube-api-access-w6v5b\") pod \"nmstate-metrics-58fcddf996-cfj2d\" (UID: \"5c73bb61-b711-4efb-8ba7-de118a9b30e7\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-cfj2d" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.101352 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8g5f\" (UniqueName: \"kubernetes.io/projected/1f61456f-3dcb-4831-9760-c06143ec9b14-kube-api-access-q8g5f\") pod \"nmstate-handler-dxkwj\" (UID: \"1f61456f-3dcb-4831-9760-c06143ec9b14\") " pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.118577 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v"] Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.155636 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-cfj2d" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.156078 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7a6e9183-4fbf-4549-926d-d8a48c0d17ac-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-svf7v\" (UID: \"7a6e9183-4fbf-4549-926d-d8a48c0d17ac\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.156244 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a6e9183-4fbf-4549-926d-d8a48c0d17ac-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-svf7v\" (UID: \"7a6e9183-4fbf-4549-926d-d8a48c0d17ac\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.156280 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzvms\" (UniqueName: \"kubernetes.io/projected/7a6e9183-4fbf-4549-926d-d8a48c0d17ac-kube-api-access-vzvms\") pod \"nmstate-console-plugin-864bb6dfb5-svf7v\" (UID: \"7a6e9183-4fbf-4549-926d-d8a48c0d17ac\") " 
pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.181493 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.256992 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7a6e9183-4fbf-4549-926d-d8a48c0d17ac-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-svf7v\" (UID: \"7a6e9183-4fbf-4549-926d-d8a48c0d17ac\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.257091 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a6e9183-4fbf-4549-926d-d8a48c0d17ac-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-svf7v\" (UID: \"7a6e9183-4fbf-4549-926d-d8a48c0d17ac\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.257135 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzvms\" (UniqueName: \"kubernetes.io/projected/7a6e9183-4fbf-4549-926d-d8a48c0d17ac-kube-api-access-vzvms\") pod \"nmstate-console-plugin-864bb6dfb5-svf7v\" (UID: \"7a6e9183-4fbf-4549-926d-d8a48c0d17ac\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" Sep 30 05:39:30 crc kubenswrapper[4956]: E0930 05:39:30.257493 4956 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Sep 30 05:39:30 crc kubenswrapper[4956]: E0930 05:39:30.257538 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a6e9183-4fbf-4549-926d-d8a48c0d17ac-plugin-serving-cert podName:7a6e9183-4fbf-4549-926d-d8a48c0d17ac nodeName:}" failed. 
No retries permitted until 2025-09-30 05:39:30.757524288 +0000 UTC m=+641.084644813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/7a6e9183-4fbf-4549-926d-d8a48c0d17ac-plugin-serving-cert") pod "nmstate-console-plugin-864bb6dfb5-svf7v" (UID: "7a6e9183-4fbf-4549-926d-d8a48c0d17ac") : secret "plugin-serving-cert" not found Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.257839 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7a6e9183-4fbf-4549-926d-d8a48c0d17ac-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-svf7v\" (UID: \"7a6e9183-4fbf-4549-926d-d8a48c0d17ac\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.287042 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-55f85fd578-87x8n"] Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.287764 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.293831 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzvms\" (UniqueName: \"kubernetes.io/projected/7a6e9183-4fbf-4549-926d-d8a48c0d17ac-kube-api-access-vzvms\") pod \"nmstate-console-plugin-864bb6dfb5-svf7v\" (UID: \"7a6e9183-4fbf-4549-926d-d8a48c0d17ac\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.300926 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55f85fd578-87x8n"] Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.404921 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-cfj2d"] Sep 30 05:39:30 crc kubenswrapper[4956]: W0930 05:39:30.413333 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c73bb61_b711_4efb_8ba7_de118a9b30e7.slice/crio-11dcf91ffcb46ac18a8a239d5c639c2d3625186ab3a1f7c11912b0c33d044887 WatchSource:0}: Error finding container 11dcf91ffcb46ac18a8a239d5c639c2d3625186ab3a1f7c11912b0c33d044887: Status 404 returned error can't find the container with id 11dcf91ffcb46ac18a8a239d5c639c2d3625186ab3a1f7c11912b0c33d044887 Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.460979 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a481fd1-6802-42ac-ae09-53bcfa42804e-trusted-ca-bundle\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.461025 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/3a481fd1-6802-42ac-ae09-53bcfa42804e-oauth-serving-cert\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.461051 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a481fd1-6802-42ac-ae09-53bcfa42804e-console-serving-cert\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.461334 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a481fd1-6802-42ac-ae09-53bcfa42804e-console-oauth-config\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.461614 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a481fd1-6802-42ac-ae09-53bcfa42804e-console-config\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.461674 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj9zv\" (UniqueName: \"kubernetes.io/projected/3a481fd1-6802-42ac-ae09-53bcfa42804e-kube-api-access-mj9zv\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.461722 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a481fd1-6802-42ac-ae09-53bcfa42804e-service-ca\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.562804 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a481fd1-6802-42ac-ae09-53bcfa42804e-oauth-serving-cert\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.562854 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a481fd1-6802-42ac-ae09-53bcfa42804e-console-serving-cert\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.562880 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a481fd1-6802-42ac-ae09-53bcfa42804e-console-oauth-config\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.562925 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8afb5dc0-a1aa-4c21-95b1-62c64b452ff1-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-vsdnt\" (UID: \"8afb5dc0-a1aa-4c21-95b1-62c64b452ff1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.562961 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a481fd1-6802-42ac-ae09-53bcfa42804e-console-config\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.562981 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj9zv\" (UniqueName: \"kubernetes.io/projected/3a481fd1-6802-42ac-ae09-53bcfa42804e-kube-api-access-mj9zv\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.563008 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a481fd1-6802-42ac-ae09-53bcfa42804e-service-ca\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.563038 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a481fd1-6802-42ac-ae09-53bcfa42804e-trusted-ca-bundle\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.564008 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a481fd1-6802-42ac-ae09-53bcfa42804e-oauth-serving-cert\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.564473 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/3a481fd1-6802-42ac-ae09-53bcfa42804e-console-config\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.565715 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a481fd1-6802-42ac-ae09-53bcfa42804e-trusted-ca-bundle\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.565729 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a481fd1-6802-42ac-ae09-53bcfa42804e-service-ca\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.568186 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8afb5dc0-a1aa-4c21-95b1-62c64b452ff1-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-vsdnt\" (UID: \"8afb5dc0-a1aa-4c21-95b1-62c64b452ff1\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.568198 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a481fd1-6802-42ac-ae09-53bcfa42804e-console-serving-cert\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.568716 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/3a481fd1-6802-42ac-ae09-53bcfa42804e-console-oauth-config\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.580672 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj9zv\" (UniqueName: \"kubernetes.io/projected/3a481fd1-6802-42ac-ae09-53bcfa42804e-kube-api-access-mj9zv\") pod \"console-55f85fd578-87x8n\" (UID: \"3a481fd1-6802-42ac-ae09-53bcfa42804e\") " pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.614947 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.764974 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a6e9183-4fbf-4549-926d-d8a48c0d17ac-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-svf7v\" (UID: \"7a6e9183-4fbf-4549-926d-d8a48c0d17ac\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.768639 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a6e9183-4fbf-4549-926d-d8a48c0d17ac-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-svf7v\" (UID: \"7a6e9183-4fbf-4549-926d-d8a48c0d17ac\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.789087 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.889392 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dxkwj" event={"ID":"1f61456f-3dcb-4831-9760-c06143ec9b14","Type":"ContainerStarted","Data":"7cdadf0e0062218eda2ae67a1b3986dbc9b0921d89f04bc0a9cce0ab43d4afd2"} Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.890818 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-cfj2d" event={"ID":"5c73bb61-b711-4efb-8ba7-de118a9b30e7","Type":"ContainerStarted","Data":"11dcf91ffcb46ac18a8a239d5c639c2d3625186ab3a1f7c11912b0c33d044887"} Sep 30 05:39:30 crc kubenswrapper[4956]: I0930 05:39:30.994080 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" Sep 30 05:39:31 crc kubenswrapper[4956]: I0930 05:39:31.020878 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55f85fd578-87x8n"] Sep 30 05:39:31 crc kubenswrapper[4956]: W0930 05:39:31.026230 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a481fd1_6802_42ac_ae09_53bcfa42804e.slice/crio-5b6522a9541cef8bbd81ddfca062955656b58a9e76d3fb8e2f41ebb649185244 WatchSource:0}: Error finding container 5b6522a9541cef8bbd81ddfca062955656b58a9e76d3fb8e2f41ebb649185244: Status 404 returned error can't find the container with id 5b6522a9541cef8bbd81ddfca062955656b58a9e76d3fb8e2f41ebb649185244 Sep 30 05:39:31 crc kubenswrapper[4956]: I0930 05:39:31.204052 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt"] Sep 30 05:39:31 crc kubenswrapper[4956]: I0930 05:39:31.207847 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v"] Sep 
30 05:39:31 crc kubenswrapper[4956]: W0930 05:39:31.210412 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a6e9183_4fbf_4549_926d_d8a48c0d17ac.slice/crio-219a6b1a60a54fd6c1ffc991d572cb9dd7cbae0286a9f99fb1ccedc72744e354 WatchSource:0}: Error finding container 219a6b1a60a54fd6c1ffc991d572cb9dd7cbae0286a9f99fb1ccedc72744e354: Status 404 returned error can't find the container with id 219a6b1a60a54fd6c1ffc991d572cb9dd7cbae0286a9f99fb1ccedc72744e354 Sep 30 05:39:31 crc kubenswrapper[4956]: W0930 05:39:31.213267 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8afb5dc0_a1aa_4c21_95b1_62c64b452ff1.slice/crio-325b50ab5b2dbeece8511b05b1f225eeef7250755da11bfb6bed871ef6e1c13c WatchSource:0}: Error finding container 325b50ab5b2dbeece8511b05b1f225eeef7250755da11bfb6bed871ef6e1c13c: Status 404 returned error can't find the container with id 325b50ab5b2dbeece8511b05b1f225eeef7250755da11bfb6bed871ef6e1c13c Sep 30 05:39:31 crc kubenswrapper[4956]: I0930 05:39:31.900845 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f85fd578-87x8n" event={"ID":"3a481fd1-6802-42ac-ae09-53bcfa42804e","Type":"ContainerStarted","Data":"d02149b3a5b5bf01c7b3c967102245d9a03b8c54a155feac3bb5508048e2023c"} Sep 30 05:39:31 crc kubenswrapper[4956]: I0930 05:39:31.901248 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f85fd578-87x8n" event={"ID":"3a481fd1-6802-42ac-ae09-53bcfa42804e","Type":"ContainerStarted","Data":"5b6522a9541cef8bbd81ddfca062955656b58a9e76d3fb8e2f41ebb649185244"} Sep 30 05:39:31 crc kubenswrapper[4956]: I0930 05:39:31.902898 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" 
event={"ID":"7a6e9183-4fbf-4549-926d-d8a48c0d17ac","Type":"ContainerStarted","Data":"219a6b1a60a54fd6c1ffc991d572cb9dd7cbae0286a9f99fb1ccedc72744e354"} Sep 30 05:39:31 crc kubenswrapper[4956]: I0930 05:39:31.905004 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" event={"ID":"8afb5dc0-a1aa-4c21-95b1-62c64b452ff1","Type":"ContainerStarted","Data":"325b50ab5b2dbeece8511b05b1f225eeef7250755da11bfb6bed871ef6e1c13c"} Sep 30 05:39:31 crc kubenswrapper[4956]: I0930 05:39:31.921997 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55f85fd578-87x8n" podStartSLOduration=1.9219776629999998 podStartE2EDuration="1.921977663s" podCreationTimestamp="2025-09-30 05:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:39:31.916227469 +0000 UTC m=+642.243348004" watchObservedRunningTime="2025-09-30 05:39:31.921977663 +0000 UTC m=+642.249098188" Sep 30 05:39:32 crc kubenswrapper[4956]: I0930 05:39:32.916987 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dxkwj" event={"ID":"1f61456f-3dcb-4831-9760-c06143ec9b14","Type":"ContainerStarted","Data":"beb335a3a033a379c06844f29ac5d7bb7867dec483b4185a48ce42695af6375f"} Sep 30 05:39:32 crc kubenswrapper[4956]: I0930 05:39:32.917982 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:32 crc kubenswrapper[4956]: I0930 05:39:32.920428 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" event={"ID":"8afb5dc0-a1aa-4c21-95b1-62c64b452ff1","Type":"ContainerStarted","Data":"72c57340c35f5dccf24649fb42834e37f4ec2be9ce2c409b01861477f991aebb"} Sep 30 05:39:32 crc kubenswrapper[4956]: I0930 05:39:32.921238 4956 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" Sep 30 05:39:32 crc kubenswrapper[4956]: I0930 05:39:32.922823 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-cfj2d" event={"ID":"5c73bb61-b711-4efb-8ba7-de118a9b30e7","Type":"ContainerStarted","Data":"d0a7e85f4671c2efa32bd7bb52d762d09b9298738eb4385959d2aeac2d501bc1"} Sep 30 05:39:32 crc kubenswrapper[4956]: I0930 05:39:32.932597 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-dxkwj" podStartSLOduration=1.613486746 podStartE2EDuration="3.932578867s" podCreationTimestamp="2025-09-30 05:39:29 +0000 UTC" firstStartedPulling="2025-09-30 05:39:30.214322872 +0000 UTC m=+640.541443397" lastFinishedPulling="2025-09-30 05:39:32.533414993 +0000 UTC m=+642.860535518" observedRunningTime="2025-09-30 05:39:32.931878745 +0000 UTC m=+643.258999280" watchObservedRunningTime="2025-09-30 05:39:32.932578867 +0000 UTC m=+643.259699402" Sep 30 05:39:32 crc kubenswrapper[4956]: I0930 05:39:32.947827 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" podStartSLOduration=2.630168185 podStartE2EDuration="3.947802615s" podCreationTimestamp="2025-09-30 05:39:29 +0000 UTC" firstStartedPulling="2025-09-30 05:39:31.215354409 +0000 UTC m=+641.542474934" lastFinishedPulling="2025-09-30 05:39:32.532988839 +0000 UTC m=+642.860109364" observedRunningTime="2025-09-30 05:39:32.945502952 +0000 UTC m=+643.272623477" watchObservedRunningTime="2025-09-30 05:39:32.947802615 +0000 UTC m=+643.274923140" Sep 30 05:39:33 crc kubenswrapper[4956]: I0930 05:39:33.933635 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" 
event={"ID":"7a6e9183-4fbf-4549-926d-d8a48c0d17ac","Type":"ContainerStarted","Data":"d47ef905db908d1b46b64fb8d10e5059b330d7653f3112a0106b7b4fbf1bbb8f"} Sep 30 05:39:33 crc kubenswrapper[4956]: I0930 05:39:33.954710 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svf7v" podStartSLOduration=1.6579136349999999 podStartE2EDuration="3.95469045s" podCreationTimestamp="2025-09-30 05:39:30 +0000 UTC" firstStartedPulling="2025-09-30 05:39:31.213263021 +0000 UTC m=+641.540383546" lastFinishedPulling="2025-09-30 05:39:33.510039836 +0000 UTC m=+643.837160361" observedRunningTime="2025-09-30 05:39:33.953788491 +0000 UTC m=+644.280909056" watchObservedRunningTime="2025-09-30 05:39:33.95469045 +0000 UTC m=+644.281810985" Sep 30 05:39:34 crc kubenswrapper[4956]: I0930 05:39:34.940814 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-cfj2d" event={"ID":"5c73bb61-b711-4efb-8ba7-de118a9b30e7","Type":"ContainerStarted","Data":"46766a039b19a3bb5beaadea02e693efb42d0ed67143be0b09e9dfb49504b9e8"} Sep 30 05:39:34 crc kubenswrapper[4956]: I0930 05:39:34.962616 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-cfj2d" podStartSLOduration=1.630574604 podStartE2EDuration="5.962599567s" podCreationTimestamp="2025-09-30 05:39:29 +0000 UTC" firstStartedPulling="2025-09-30 05:39:30.416800622 +0000 UTC m=+640.743921147" lastFinishedPulling="2025-09-30 05:39:34.748825555 +0000 UTC m=+645.075946110" observedRunningTime="2025-09-30 05:39:34.961691308 +0000 UTC m=+645.288811883" watchObservedRunningTime="2025-09-30 05:39:34.962599567 +0000 UTC m=+645.289720092" Sep 30 05:39:40 crc kubenswrapper[4956]: I0930 05:39:40.219080 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-dxkwj" Sep 30 05:39:40 crc kubenswrapper[4956]: I0930 
05:39:40.615478 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:40 crc kubenswrapper[4956]: I0930 05:39:40.615532 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:40 crc kubenswrapper[4956]: I0930 05:39:40.620861 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:40 crc kubenswrapper[4956]: I0930 05:39:40.984253 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55f85fd578-87x8n" Sep 30 05:39:41 crc kubenswrapper[4956]: I0930 05:39:41.051836 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rtd48"] Sep 30 05:39:50 crc kubenswrapper[4956]: I0930 05:39:50.793873 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-vsdnt" Sep 30 05:40:03 crc kubenswrapper[4956]: I0930 05:40:03.763802 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh"] Sep 30 05:40:03 crc kubenswrapper[4956]: I0930 05:40:03.765569 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:03 crc kubenswrapper[4956]: I0930 05:40:03.773929 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh"] Sep 30 05:40:03 crc kubenswrapper[4956]: I0930 05:40:03.776092 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 05:40:03 crc kubenswrapper[4956]: I0930 05:40:03.917835 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5435ca6f-5e89-4884-ab53-526b66bf688e-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh\" (UID: \"5435ca6f-5e89-4884-ab53-526b66bf688e\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:03 crc kubenswrapper[4956]: I0930 05:40:03.917905 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5435ca6f-5e89-4884-ab53-526b66bf688e-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh\" (UID: \"5435ca6f-5e89-4884-ab53-526b66bf688e\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:03 crc kubenswrapper[4956]: I0930 05:40:03.917972 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdt5d\" (UniqueName: \"kubernetes.io/projected/5435ca6f-5e89-4884-ab53-526b66bf688e-kube-api-access-fdt5d\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh\" (UID: \"5435ca6f-5e89-4884-ab53-526b66bf688e\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:04 crc kubenswrapper[4956]: 
I0930 05:40:04.019371 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5435ca6f-5e89-4884-ab53-526b66bf688e-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh\" (UID: \"5435ca6f-5e89-4884-ab53-526b66bf688e\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:04 crc kubenswrapper[4956]: I0930 05:40:04.019430 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5435ca6f-5e89-4884-ab53-526b66bf688e-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh\" (UID: \"5435ca6f-5e89-4884-ab53-526b66bf688e\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:04 crc kubenswrapper[4956]: I0930 05:40:04.019493 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdt5d\" (UniqueName: \"kubernetes.io/projected/5435ca6f-5e89-4884-ab53-526b66bf688e-kube-api-access-fdt5d\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh\" (UID: \"5435ca6f-5e89-4884-ab53-526b66bf688e\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:04 crc kubenswrapper[4956]: I0930 05:40:04.020006 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5435ca6f-5e89-4884-ab53-526b66bf688e-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh\" (UID: \"5435ca6f-5e89-4884-ab53-526b66bf688e\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:04 crc kubenswrapper[4956]: I0930 05:40:04.020071 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/5435ca6f-5e89-4884-ab53-526b66bf688e-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh\" (UID: \"5435ca6f-5e89-4884-ab53-526b66bf688e\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:04 crc kubenswrapper[4956]: I0930 05:40:04.045439 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdt5d\" (UniqueName: \"kubernetes.io/projected/5435ca6f-5e89-4884-ab53-526b66bf688e-kube-api-access-fdt5d\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh\" (UID: \"5435ca6f-5e89-4884-ab53-526b66bf688e\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:04 crc kubenswrapper[4956]: I0930 05:40:04.081536 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:04 crc kubenswrapper[4956]: I0930 05:40:04.298024 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh"] Sep 30 05:40:05 crc kubenswrapper[4956]: I0930 05:40:05.135595 4956 generic.go:334] "Generic (PLEG): container finished" podID="5435ca6f-5e89-4884-ab53-526b66bf688e" containerID="d054d999615ef23391d95e67b80b3c98c1b3d4f9b18f3c8b714b850244c3ea59" exitCode=0 Sep 30 05:40:05 crc kubenswrapper[4956]: I0930 05:40:05.136134 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" event={"ID":"5435ca6f-5e89-4884-ab53-526b66bf688e","Type":"ContainerDied","Data":"d054d999615ef23391d95e67b80b3c98c1b3d4f9b18f3c8b714b850244c3ea59"} Sep 30 05:40:05 crc kubenswrapper[4956]: I0930 05:40:05.137412 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" event={"ID":"5435ca6f-5e89-4884-ab53-526b66bf688e","Type":"ContainerStarted","Data":"a24da6d15a6f17441e4c01673dbbe818cd8dd4a017b12f0c64e1a786f780174c"} Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.090930 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rtd48" podUID="8f523f25-bc41-44dc-b311-bf6df1cbc2ee" containerName="console" containerID="cri-o://14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7" gracePeriod=15 Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.518103 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rtd48_8f523f25-bc41-44dc-b311-bf6df1cbc2ee/console/0.log" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.518384 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.660302 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-oauth-serving-cert\") pod \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.660347 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-oauth-config\") pod \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.660378 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8sd5\" (UniqueName: 
\"kubernetes.io/projected/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-kube-api-access-w8sd5\") pod \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.660444 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-service-ca\") pod \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.660480 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-serving-cert\") pod \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.660502 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-config\") pod \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.660520 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-trusted-ca-bundle\") pod \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\" (UID: \"8f523f25-bc41-44dc-b311-bf6df1cbc2ee\") " Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.661223 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8f523f25-bc41-44dc-b311-bf6df1cbc2ee" (UID: "8f523f25-bc41-44dc-b311-bf6df1cbc2ee"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.661257 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8f523f25-bc41-44dc-b311-bf6df1cbc2ee" (UID: "8f523f25-bc41-44dc-b311-bf6df1cbc2ee"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.661683 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-service-ca" (OuterVolumeSpecName: "service-ca") pod "8f523f25-bc41-44dc-b311-bf6df1cbc2ee" (UID: "8f523f25-bc41-44dc-b311-bf6df1cbc2ee"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.662051 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-config" (OuterVolumeSpecName: "console-config") pod "8f523f25-bc41-44dc-b311-bf6df1cbc2ee" (UID: "8f523f25-bc41-44dc-b311-bf6df1cbc2ee"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.666611 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8f523f25-bc41-44dc-b311-bf6df1cbc2ee" (UID: "8f523f25-bc41-44dc-b311-bf6df1cbc2ee"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.667534 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-kube-api-access-w8sd5" (OuterVolumeSpecName: "kube-api-access-w8sd5") pod "8f523f25-bc41-44dc-b311-bf6df1cbc2ee" (UID: "8f523f25-bc41-44dc-b311-bf6df1cbc2ee"). InnerVolumeSpecName "kube-api-access-w8sd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.667740 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8f523f25-bc41-44dc-b311-bf6df1cbc2ee" (UID: "8f523f25-bc41-44dc-b311-bf6df1cbc2ee"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.761782 4956 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.761835 4956 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.761849 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8sd5\" (UniqueName: \"kubernetes.io/projected/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-kube-api-access-w8sd5\") on node \"crc\" DevicePath \"\"" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.761864 4956 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.761876 4956 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.761888 4956 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:40:06 crc kubenswrapper[4956]: I0930 05:40:06.761899 4956 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f523f25-bc41-44dc-b311-bf6df1cbc2ee-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:40:07 crc kubenswrapper[4956]: I0930 05:40:07.173353 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rtd48_8f523f25-bc41-44dc-b311-bf6df1cbc2ee/console/0.log" Sep 30 05:40:07 crc kubenswrapper[4956]: I0930 05:40:07.173402 4956 generic.go:334] "Generic (PLEG): container finished" podID="8f523f25-bc41-44dc-b311-bf6df1cbc2ee" containerID="14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7" exitCode=2 Sep 30 05:40:07 crc kubenswrapper[4956]: I0930 05:40:07.173497 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rtd48" Sep 30 05:40:07 crc kubenswrapper[4956]: I0930 05:40:07.173522 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rtd48" event={"ID":"8f523f25-bc41-44dc-b311-bf6df1cbc2ee","Type":"ContainerDied","Data":"14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7"} Sep 30 05:40:07 crc kubenswrapper[4956]: I0930 05:40:07.173557 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rtd48" event={"ID":"8f523f25-bc41-44dc-b311-bf6df1cbc2ee","Type":"ContainerDied","Data":"b0596f4275d84fec44630e6175cfaad09c7d7dd80db8671b890dd42a0d3add18"} Sep 30 05:40:07 crc kubenswrapper[4956]: I0930 05:40:07.173591 4956 scope.go:117] "RemoveContainer" containerID="14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7" Sep 30 05:40:07 crc kubenswrapper[4956]: I0930 05:40:07.177956 4956 generic.go:334] "Generic (PLEG): container finished" podID="5435ca6f-5e89-4884-ab53-526b66bf688e" containerID="31c3c8f42dd54ac03defa8dc669f656cf17fe3fd17d1eb499fe082f5ba825659" exitCode=0 Sep 30 05:40:07 crc kubenswrapper[4956]: I0930 05:40:07.178002 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" event={"ID":"5435ca6f-5e89-4884-ab53-526b66bf688e","Type":"ContainerDied","Data":"31c3c8f42dd54ac03defa8dc669f656cf17fe3fd17d1eb499fe082f5ba825659"} Sep 30 05:40:07 crc kubenswrapper[4956]: I0930 05:40:07.199251 4956 scope.go:117] "RemoveContainer" containerID="14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7" Sep 30 05:40:07 crc kubenswrapper[4956]: E0930 05:40:07.200537 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7\": container with ID starting with 
14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7 not found: ID does not exist" containerID="14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7" Sep 30 05:40:07 crc kubenswrapper[4956]: I0930 05:40:07.200574 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7"} err="failed to get container status \"14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7\": rpc error: code = NotFound desc = could not find container \"14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7\": container with ID starting with 14771374a27f2b5733872575eab229d5b9496b969249e62685a13e6802b624e7 not found: ID does not exist" Sep 30 05:40:07 crc kubenswrapper[4956]: I0930 05:40:07.213951 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rtd48"] Sep 30 05:40:07 crc kubenswrapper[4956]: I0930 05:40:07.216917 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rtd48"] Sep 30 05:40:08 crc kubenswrapper[4956]: I0930 05:40:08.189349 4956 generic.go:334] "Generic (PLEG): container finished" podID="5435ca6f-5e89-4884-ab53-526b66bf688e" containerID="2a31e2492cd50a42798e65fd5b1d25eb2cb2aaf1618ca16e17df087857f0bb20" exitCode=0 Sep 30 05:40:08 crc kubenswrapper[4956]: I0930 05:40:08.189515 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" event={"ID":"5435ca6f-5e89-4884-ab53-526b66bf688e","Type":"ContainerDied","Data":"2a31e2492cd50a42798e65fd5b1d25eb2cb2aaf1618ca16e17df087857f0bb20"} Sep 30 05:40:08 crc kubenswrapper[4956]: I0930 05:40:08.353369 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f523f25-bc41-44dc-b311-bf6df1cbc2ee" path="/var/lib/kubelet/pods/8f523f25-bc41-44dc-b311-bf6df1cbc2ee/volumes" Sep 30 05:40:09 crc 
kubenswrapper[4956]: I0930 05:40:09.476330 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:09 crc kubenswrapper[4956]: I0930 05:40:09.497824 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdt5d\" (UniqueName: \"kubernetes.io/projected/5435ca6f-5e89-4884-ab53-526b66bf688e-kube-api-access-fdt5d\") pod \"5435ca6f-5e89-4884-ab53-526b66bf688e\" (UID: \"5435ca6f-5e89-4884-ab53-526b66bf688e\") " Sep 30 05:40:09 crc kubenswrapper[4956]: I0930 05:40:09.497886 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5435ca6f-5e89-4884-ab53-526b66bf688e-bundle\") pod \"5435ca6f-5e89-4884-ab53-526b66bf688e\" (UID: \"5435ca6f-5e89-4884-ab53-526b66bf688e\") " Sep 30 05:40:09 crc kubenswrapper[4956]: I0930 05:40:09.498682 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5435ca6f-5e89-4884-ab53-526b66bf688e-util\") pod \"5435ca6f-5e89-4884-ab53-526b66bf688e\" (UID: \"5435ca6f-5e89-4884-ab53-526b66bf688e\") " Sep 30 05:40:09 crc kubenswrapper[4956]: I0930 05:40:09.500903 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5435ca6f-5e89-4884-ab53-526b66bf688e-bundle" (OuterVolumeSpecName: "bundle") pod "5435ca6f-5e89-4884-ab53-526b66bf688e" (UID: "5435ca6f-5e89-4884-ab53-526b66bf688e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:40:09 crc kubenswrapper[4956]: I0930 05:40:09.506357 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5435ca6f-5e89-4884-ab53-526b66bf688e-kube-api-access-fdt5d" (OuterVolumeSpecName: "kube-api-access-fdt5d") pod "5435ca6f-5e89-4884-ab53-526b66bf688e" (UID: "5435ca6f-5e89-4884-ab53-526b66bf688e"). InnerVolumeSpecName "kube-api-access-fdt5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:40:09 crc kubenswrapper[4956]: I0930 05:40:09.508479 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdt5d\" (UniqueName: \"kubernetes.io/projected/5435ca6f-5e89-4884-ab53-526b66bf688e-kube-api-access-fdt5d\") on node \"crc\" DevicePath \"\"" Sep 30 05:40:09 crc kubenswrapper[4956]: I0930 05:40:09.509312 4956 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5435ca6f-5e89-4884-ab53-526b66bf688e-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:40:09 crc kubenswrapper[4956]: I0930 05:40:09.515394 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5435ca6f-5e89-4884-ab53-526b66bf688e-util" (OuterVolumeSpecName: "util") pod "5435ca6f-5e89-4884-ab53-526b66bf688e" (UID: "5435ca6f-5e89-4884-ab53-526b66bf688e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:40:09 crc kubenswrapper[4956]: I0930 05:40:09.610470 4956 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5435ca6f-5e89-4884-ab53-526b66bf688e-util\") on node \"crc\" DevicePath \"\"" Sep 30 05:40:10 crc kubenswrapper[4956]: I0930 05:40:10.206624 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" event={"ID":"5435ca6f-5e89-4884-ab53-526b66bf688e","Type":"ContainerDied","Data":"a24da6d15a6f17441e4c01673dbbe818cd8dd4a017b12f0c64e1a786f780174c"} Sep 30 05:40:10 crc kubenswrapper[4956]: I0930 05:40:10.206686 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a24da6d15a6f17441e4c01673dbbe818cd8dd4a017b12f0c64e1a786f780174c" Sep 30 05:40:10 crc kubenswrapper[4956]: I0930 05:40:10.206810 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh" Sep 30 05:40:18 crc kubenswrapper[4956]: I0930 05:40:18.073451 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:40:18 crc kubenswrapper[4956]: I0930 05:40:18.073791 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.948344 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf"] Sep 30 05:40:19 crc kubenswrapper[4956]: E0930 05:40:19.948896 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5435ca6f-5e89-4884-ab53-526b66bf688e" containerName="pull" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.948912 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5435ca6f-5e89-4884-ab53-526b66bf688e" containerName="pull" Sep 30 05:40:19 crc kubenswrapper[4956]: E0930 05:40:19.948930 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5435ca6f-5e89-4884-ab53-526b66bf688e" containerName="util" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.948938 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5435ca6f-5e89-4884-ab53-526b66bf688e" containerName="util" Sep 30 05:40:19 crc kubenswrapper[4956]: E0930 05:40:19.948951 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5435ca6f-5e89-4884-ab53-526b66bf688e" containerName="extract" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.948959 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5435ca6f-5e89-4884-ab53-526b66bf688e" containerName="extract" Sep 30 05:40:19 crc kubenswrapper[4956]: E0930 05:40:19.948974 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f523f25-bc41-44dc-b311-bf6df1cbc2ee" containerName="console" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.948982 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f523f25-bc41-44dc-b311-bf6df1cbc2ee" containerName="console" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.949104 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5435ca6f-5e89-4884-ab53-526b66bf688e" containerName="extract" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.949134 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f523f25-bc41-44dc-b311-bf6df1cbc2ee" containerName="console" Sep 30 05:40:19 crc 
kubenswrapper[4956]: I0930 05:40:19.949607 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.951715 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.951742 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.951758 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.951830 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-vrpmq" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.951848 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 30 05:40:19 crc kubenswrapper[4956]: I0930 05:40:19.964628 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf"] Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.035557 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wx55\" (UniqueName: \"kubernetes.io/projected/1cc7609e-41b3-4fb2-98f4-6cc743299a2f-kube-api-access-9wx55\") pod \"metallb-operator-controller-manager-7478c46d8f-h6xxf\" (UID: \"1cc7609e-41b3-4fb2-98f4-6cc743299a2f\") " pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.035689 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/1cc7609e-41b3-4fb2-98f4-6cc743299a2f-webhook-cert\") pod \"metallb-operator-controller-manager-7478c46d8f-h6xxf\" (UID: \"1cc7609e-41b3-4fb2-98f4-6cc743299a2f\") " pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.035743 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cc7609e-41b3-4fb2-98f4-6cc743299a2f-apiservice-cert\") pod \"metallb-operator-controller-manager-7478c46d8f-h6xxf\" (UID: \"1cc7609e-41b3-4fb2-98f4-6cc743299a2f\") " pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.137031 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1cc7609e-41b3-4fb2-98f4-6cc743299a2f-webhook-cert\") pod \"metallb-operator-controller-manager-7478c46d8f-h6xxf\" (UID: \"1cc7609e-41b3-4fb2-98f4-6cc743299a2f\") " pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.137086 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cc7609e-41b3-4fb2-98f4-6cc743299a2f-apiservice-cert\") pod \"metallb-operator-controller-manager-7478c46d8f-h6xxf\" (UID: \"1cc7609e-41b3-4fb2-98f4-6cc743299a2f\") " pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.137152 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wx55\" (UniqueName: \"kubernetes.io/projected/1cc7609e-41b3-4fb2-98f4-6cc743299a2f-kube-api-access-9wx55\") pod \"metallb-operator-controller-manager-7478c46d8f-h6xxf\" (UID: \"1cc7609e-41b3-4fb2-98f4-6cc743299a2f\") " 
pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.143538 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cc7609e-41b3-4fb2-98f4-6cc743299a2f-apiservice-cert\") pod \"metallb-operator-controller-manager-7478c46d8f-h6xxf\" (UID: \"1cc7609e-41b3-4fb2-98f4-6cc743299a2f\") " pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.144804 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1cc7609e-41b3-4fb2-98f4-6cc743299a2f-webhook-cert\") pod \"metallb-operator-controller-manager-7478c46d8f-h6xxf\" (UID: \"1cc7609e-41b3-4fb2-98f4-6cc743299a2f\") " pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.156953 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wx55\" (UniqueName: \"kubernetes.io/projected/1cc7609e-41b3-4fb2-98f4-6cc743299a2f-kube-api-access-9wx55\") pod \"metallb-operator-controller-manager-7478c46d8f-h6xxf\" (UID: \"1cc7609e-41b3-4fb2-98f4-6cc743299a2f\") " pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.174377 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6"] Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.175103 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.179684 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.179691 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.180434 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qszmg" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.201262 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6"] Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.238390 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/889c063f-2550-48ed-957c-150f8f1192e3-apiservice-cert\") pod \"metallb-operator-webhook-server-6bbf45b88f-6bpm6\" (UID: \"889c063f-2550-48ed-957c-150f8f1192e3\") " pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.238469 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74mgk\" (UniqueName: \"kubernetes.io/projected/889c063f-2550-48ed-957c-150f8f1192e3-kube-api-access-74mgk\") pod \"metallb-operator-webhook-server-6bbf45b88f-6bpm6\" (UID: \"889c063f-2550-48ed-957c-150f8f1192e3\") " pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.238512 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/889c063f-2550-48ed-957c-150f8f1192e3-webhook-cert\") pod \"metallb-operator-webhook-server-6bbf45b88f-6bpm6\" (UID: \"889c063f-2550-48ed-957c-150f8f1192e3\") " pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.266702 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.340861 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74mgk\" (UniqueName: \"kubernetes.io/projected/889c063f-2550-48ed-957c-150f8f1192e3-kube-api-access-74mgk\") pod \"metallb-operator-webhook-server-6bbf45b88f-6bpm6\" (UID: \"889c063f-2550-48ed-957c-150f8f1192e3\") " pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.340921 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/889c063f-2550-48ed-957c-150f8f1192e3-webhook-cert\") pod \"metallb-operator-webhook-server-6bbf45b88f-6bpm6\" (UID: \"889c063f-2550-48ed-957c-150f8f1192e3\") " pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.340960 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/889c063f-2550-48ed-957c-150f8f1192e3-apiservice-cert\") pod \"metallb-operator-webhook-server-6bbf45b88f-6bpm6\" (UID: \"889c063f-2550-48ed-957c-150f8f1192e3\") " pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.362907 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/889c063f-2550-48ed-957c-150f8f1192e3-webhook-cert\") pod \"metallb-operator-webhook-server-6bbf45b88f-6bpm6\" (UID: \"889c063f-2550-48ed-957c-150f8f1192e3\") " pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.378084 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/889c063f-2550-48ed-957c-150f8f1192e3-apiservice-cert\") pod \"metallb-operator-webhook-server-6bbf45b88f-6bpm6\" (UID: \"889c063f-2550-48ed-957c-150f8f1192e3\") " pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.385160 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74mgk\" (UniqueName: \"kubernetes.io/projected/889c063f-2550-48ed-957c-150f8f1192e3-kube-api-access-74mgk\") pod \"metallb-operator-webhook-server-6bbf45b88f-6bpm6\" (UID: \"889c063f-2550-48ed-957c-150f8f1192e3\") " pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.503978 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.710848 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf"] Sep 30 05:40:20 crc kubenswrapper[4956]: I0930 05:40:20.934566 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6"] Sep 30 05:40:20 crc kubenswrapper[4956]: W0930 05:40:20.941917 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod889c063f_2550_48ed_957c_150f8f1192e3.slice/crio-819afa10a6b50fc71359506c90ced302c1f83c6b9b3c3edc3054c629d8f609d8 WatchSource:0}: Error finding container 819afa10a6b50fc71359506c90ced302c1f83c6b9b3c3edc3054c629d8f609d8: Status 404 returned error can't find the container with id 819afa10a6b50fc71359506c90ced302c1f83c6b9b3c3edc3054c629d8f609d8 Sep 30 05:40:21 crc kubenswrapper[4956]: I0930 05:40:21.267865 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" event={"ID":"889c063f-2550-48ed-957c-150f8f1192e3","Type":"ContainerStarted","Data":"819afa10a6b50fc71359506c90ced302c1f83c6b9b3c3edc3054c629d8f609d8"} Sep 30 05:40:21 crc kubenswrapper[4956]: I0930 05:40:21.269031 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" event={"ID":"1cc7609e-41b3-4fb2-98f4-6cc743299a2f","Type":"ContainerStarted","Data":"432776502d51beadda00a25f8092c2106479428a33370b88f8ceb468ad3fd6f4"} Sep 30 05:40:26 crc kubenswrapper[4956]: I0930 05:40:26.305584 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" 
event={"ID":"889c063f-2550-48ed-957c-150f8f1192e3","Type":"ContainerStarted","Data":"c77c15980ee153099fded795868c05adbf0a12477845e6706871f0be4c905884"} Sep 30 05:40:26 crc kubenswrapper[4956]: I0930 05:40:26.306226 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:26 crc kubenswrapper[4956]: I0930 05:40:26.307029 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" event={"ID":"1cc7609e-41b3-4fb2-98f4-6cc743299a2f","Type":"ContainerStarted","Data":"7dee382b6dd6fa0cd2e2eb93d4b0cc23e44583c6ec9349e3c97dafc8145c6394"} Sep 30 05:40:26 crc kubenswrapper[4956]: I0930 05:40:26.307459 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:40:26 crc kubenswrapper[4956]: I0930 05:40:26.351318 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" podStartSLOduration=3.001438734 podStartE2EDuration="7.35130072s" podCreationTimestamp="2025-09-30 05:40:19 +0000 UTC" firstStartedPulling="2025-09-30 05:40:20.729589593 +0000 UTC m=+691.056710118" lastFinishedPulling="2025-09-30 05:40:25.079451578 +0000 UTC m=+695.406572104" observedRunningTime="2025-09-30 05:40:26.351103143 +0000 UTC m=+696.678223658" watchObservedRunningTime="2025-09-30 05:40:26.35130072 +0000 UTC m=+696.678421245" Sep 30 05:40:26 crc kubenswrapper[4956]: I0930 05:40:26.354540 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" podStartSLOduration=2.207630843 podStartE2EDuration="6.354532883s" podCreationTimestamp="2025-09-30 05:40:20 +0000 UTC" firstStartedPulling="2025-09-30 05:40:20.944698739 +0000 UTC m=+691.271819264" lastFinishedPulling="2025-09-30 
05:40:25.091600769 +0000 UTC m=+695.418721304" observedRunningTime="2025-09-30 05:40:26.326837094 +0000 UTC m=+696.653957619" watchObservedRunningTime="2025-09-30 05:40:26.354532883 +0000 UTC m=+696.681653408" Sep 30 05:40:40 crc kubenswrapper[4956]: I0930 05:40:40.508545 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6bbf45b88f-6bpm6" Sep 30 05:40:48 crc kubenswrapper[4956]: I0930 05:40:48.073413 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:40:48 crc kubenswrapper[4956]: I0930 05:40:48.074066 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:41:00 crc kubenswrapper[4956]: I0930 05:41:00.269538 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7478c46d8f-h6xxf" Sep 30 05:41:00 crc kubenswrapper[4956]: I0930 05:41:00.957697 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5mnk7"] Sep 30 05:41:00 crc kubenswrapper[4956]: I0930 05:41:00.960375 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:00 crc kubenswrapper[4956]: I0930 05:41:00.963909 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 05:41:00 crc kubenswrapper[4956]: I0930 05:41:00.964161 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-c7c4j" Sep 30 05:41:00 crc kubenswrapper[4956]: I0930 05:41:00.964286 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 05:41:00 crc kubenswrapper[4956]: I0930 05:41:00.980459 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt"] Sep 30 05:41:00 crc kubenswrapper[4956]: I0930 05:41:00.981469 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" Sep 30 05:41:00 crc kubenswrapper[4956]: I0930 05:41:00.985544 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.004064 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt"] Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.044473 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2kv9\" (UniqueName: \"kubernetes.io/projected/26860b1e-eab3-4a86-b87d-3c52529f70dd-kube-api-access-q2kv9\") pod \"frr-k8s-webhook-server-5478bdb765-7r8bt\" (UID: \"26860b1e-eab3-4a86-b87d-3c52529f70dd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.044586 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/26860b1e-eab3-4a86-b87d-3c52529f70dd-cert\") pod \"frr-k8s-webhook-server-5478bdb765-7r8bt\" (UID: \"26860b1e-eab3-4a86-b87d-3c52529f70dd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.082682 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5pvg9"] Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.083608 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.088515 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.088532 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.088857 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-4q695" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.090176 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.096059 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-ntsz7"] Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.096943 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.099336 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.109997 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-ntsz7"] Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.148767 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26860b1e-eab3-4a86-b87d-3c52529f70dd-cert\") pod \"frr-k8s-webhook-server-5478bdb765-7r8bt\" (UID: \"26860b1e-eab3-4a86-b87d-3c52529f70dd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.148829 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/809472fb-9344-46be-81f4-808a1fcc16c6-metrics\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.148861 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/809472fb-9344-46be-81f4-808a1fcc16c6-frr-sockets\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.148899 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2kv9\" (UniqueName: \"kubernetes.io/projected/26860b1e-eab3-4a86-b87d-3c52529f70dd-kube-api-access-q2kv9\") pod \"frr-k8s-webhook-server-5478bdb765-7r8bt\" (UID: \"26860b1e-eab3-4a86-b87d-3c52529f70dd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" Sep 30 05:41:01 
crc kubenswrapper[4956]: I0930 05:41:01.148922 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/809472fb-9344-46be-81f4-808a1fcc16c6-metrics-certs\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: E0930 05:41:01.149016 4956 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Sep 30 05:41:01 crc kubenswrapper[4956]: E0930 05:41:01.149078 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26860b1e-eab3-4a86-b87d-3c52529f70dd-cert podName:26860b1e-eab3-4a86-b87d-3c52529f70dd nodeName:}" failed. No retries permitted until 2025-09-30 05:41:01.649054825 +0000 UTC m=+731.976175350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26860b1e-eab3-4a86-b87d-3c52529f70dd-cert") pod "frr-k8s-webhook-server-5478bdb765-7r8bt" (UID: "26860b1e-eab3-4a86-b87d-3c52529f70dd") : secret "frr-k8s-webhook-server-cert" not found Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.149406 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/809472fb-9344-46be-81f4-808a1fcc16c6-frr-startup\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.149483 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/809472fb-9344-46be-81f4-808a1fcc16c6-reloader\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 
05:41:01.149512 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnnsj\" (UniqueName: \"kubernetes.io/projected/809472fb-9344-46be-81f4-808a1fcc16c6-kube-api-access-tnnsj\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.149584 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/809472fb-9344-46be-81f4-808a1fcc16c6-frr-conf\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.171906 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2kv9\" (UniqueName: \"kubernetes.io/projected/26860b1e-eab3-4a86-b87d-3c52529f70dd-kube-api-access-q2kv9\") pod \"frr-k8s-webhook-server-5478bdb765-7r8bt\" (UID: \"26860b1e-eab3-4a86-b87d-3c52529f70dd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250662 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/809472fb-9344-46be-81f4-808a1fcc16c6-metrics\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250714 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/809472fb-9344-46be-81f4-808a1fcc16c6-frr-sockets\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250747 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-memberlist\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250772 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/809472fb-9344-46be-81f4-808a1fcc16c6-metrics-certs\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250796 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/809472fb-9344-46be-81f4-808a1fcc16c6-frr-startup\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250809 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-metrics-certs\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250832 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-metallb-excludel2\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250852 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f07b182-685f-40c3-961e-eebfbf2d5fe5-metrics-certs\") pod 
\"controller-5d688f5ffc-ntsz7\" (UID: \"0f07b182-685f-40c3-961e-eebfbf2d5fe5\") " pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250868 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/809472fb-9344-46be-81f4-808a1fcc16c6-reloader\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250887 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnsj\" (UniqueName: \"kubernetes.io/projected/809472fb-9344-46be-81f4-808a1fcc16c6-kube-api-access-tnnsj\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250906 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f07b182-685f-40c3-961e-eebfbf2d5fe5-cert\") pod \"controller-5d688f5ffc-ntsz7\" (UID: \"0f07b182-685f-40c3-961e-eebfbf2d5fe5\") " pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250920 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq426\" (UniqueName: \"kubernetes.io/projected/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-kube-api-access-jq426\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250952 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/809472fb-9344-46be-81f4-808a1fcc16c6-frr-conf\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" 
Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.250976 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pspp\" (UniqueName: \"kubernetes.io/projected/0f07b182-685f-40c3-961e-eebfbf2d5fe5-kube-api-access-7pspp\") pod \"controller-5d688f5ffc-ntsz7\" (UID: \"0f07b182-685f-40c3-961e-eebfbf2d5fe5\") " pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.251606 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/809472fb-9344-46be-81f4-808a1fcc16c6-metrics\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.251652 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/809472fb-9344-46be-81f4-808a1fcc16c6-frr-sockets\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.251862 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/809472fb-9344-46be-81f4-808a1fcc16c6-reloader\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.252318 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/809472fb-9344-46be-81f4-808a1fcc16c6-frr-conf\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.252904 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/809472fb-9344-46be-81f4-808a1fcc16c6-frr-startup\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.254358 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/809472fb-9344-46be-81f4-808a1fcc16c6-metrics-certs\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.286418 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnnsj\" (UniqueName: \"kubernetes.io/projected/809472fb-9344-46be-81f4-808a1fcc16c6-kube-api-access-tnnsj\") pod \"frr-k8s-5mnk7\" (UID: \"809472fb-9344-46be-81f4-808a1fcc16c6\") " pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.352264 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pspp\" (UniqueName: \"kubernetes.io/projected/0f07b182-685f-40c3-961e-eebfbf2d5fe5-kube-api-access-7pspp\") pod \"controller-5d688f5ffc-ntsz7\" (UID: \"0f07b182-685f-40c3-961e-eebfbf2d5fe5\") " pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.352345 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-memberlist\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.352372 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-metrics-certs\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " 
pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.352394 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-metallb-excludel2\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.352414 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f07b182-685f-40c3-961e-eebfbf2d5fe5-metrics-certs\") pod \"controller-5d688f5ffc-ntsz7\" (UID: \"0f07b182-685f-40c3-961e-eebfbf2d5fe5\") " pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.352434 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f07b182-685f-40c3-961e-eebfbf2d5fe5-cert\") pod \"controller-5d688f5ffc-ntsz7\" (UID: \"0f07b182-685f-40c3-961e-eebfbf2d5fe5\") " pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.352449 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq426\" (UniqueName: \"kubernetes.io/projected/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-kube-api-access-jq426\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: E0930 05:41:01.352850 4956 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 05:41:01 crc kubenswrapper[4956]: E0930 05:41:01.352894 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-memberlist podName:c59b762c-1f05-46c5-8d6d-2bf39a8592f0 nodeName:}" failed. 
No retries permitted until 2025-09-30 05:41:01.852881998 +0000 UTC m=+732.180002523 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-memberlist") pod "speaker-5pvg9" (UID: "c59b762c-1f05-46c5-8d6d-2bf39a8592f0") : secret "metallb-memberlist" not found Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.354383 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-metallb-excludel2\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.358472 4956 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.358594 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-metrics-certs\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.358851 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f07b182-685f-40c3-961e-eebfbf2d5fe5-metrics-certs\") pod \"controller-5d688f5ffc-ntsz7\" (UID: \"0f07b182-685f-40c3-961e-eebfbf2d5fe5\") " pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.366891 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f07b182-685f-40c3-961e-eebfbf2d5fe5-cert\") pod \"controller-5d688f5ffc-ntsz7\" (UID: \"0f07b182-685f-40c3-961e-eebfbf2d5fe5\") " pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:01 
crc kubenswrapper[4956]: I0930 05:41:01.370071 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq426\" (UniqueName: \"kubernetes.io/projected/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-kube-api-access-jq426\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.374498 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pspp\" (UniqueName: \"kubernetes.io/projected/0f07b182-685f-40c3-961e-eebfbf2d5fe5-kube-api-access-7pspp\") pod \"controller-5d688f5ffc-ntsz7\" (UID: \"0f07b182-685f-40c3-961e-eebfbf2d5fe5\") " pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.410931 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.581880 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.657191 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26860b1e-eab3-4a86-b87d-3c52529f70dd-cert\") pod \"frr-k8s-webhook-server-5478bdb765-7r8bt\" (UID: \"26860b1e-eab3-4a86-b87d-3c52529f70dd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.662609 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26860b1e-eab3-4a86-b87d-3c52529f70dd-cert\") pod \"frr-k8s-webhook-server-5478bdb765-7r8bt\" (UID: \"26860b1e-eab3-4a86-b87d-3c52529f70dd\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.801363 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-ntsz7"] Sep 30 05:41:01 crc kubenswrapper[4956]: W0930 05:41:01.806017 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f07b182_685f_40c3_961e_eebfbf2d5fe5.slice/crio-3304cf89f7229ed085262f9d8860619a3d7cd969c9e8e805a29b3641b9973ca2 WatchSource:0}: Error finding container 3304cf89f7229ed085262f9d8860619a3d7cd969c9e8e805a29b3641b9973ca2: Status 404 returned error can't find the container with id 3304cf89f7229ed085262f9d8860619a3d7cd969c9e8e805a29b3641b9973ca2 Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.858937 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-memberlist\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:01 crc kubenswrapper[4956]: E0930 05:41:01.859086 4956 secret.go:188] Couldn't get 
secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 05:41:01 crc kubenswrapper[4956]: E0930 05:41:01.859171 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-memberlist podName:c59b762c-1f05-46c5-8d6d-2bf39a8592f0 nodeName:}" failed. No retries permitted until 2025-09-30 05:41:02.859151691 +0000 UTC m=+733.186272226 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-memberlist") pod "speaker-5pvg9" (UID: "c59b762c-1f05-46c5-8d6d-2bf39a8592f0") : secret "metallb-memberlist" not found Sep 30 05:41:01 crc kubenswrapper[4956]: I0930 05:41:01.898591 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" Sep 30 05:41:02 crc kubenswrapper[4956]: I0930 05:41:02.107451 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt"] Sep 30 05:41:02 crc kubenswrapper[4956]: I0930 05:41:02.516695 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" event={"ID":"26860b1e-eab3-4a86-b87d-3c52529f70dd","Type":"ContainerStarted","Data":"9f11051f5bf782af1e62875be28fd6d448f0bb49f9d33b2b8f3bbdabe4ea371d"} Sep 30 05:41:02 crc kubenswrapper[4956]: I0930 05:41:02.517791 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5mnk7" event={"ID":"809472fb-9344-46be-81f4-808a1fcc16c6","Type":"ContainerStarted","Data":"24950441ed8225544214f5869592768b22c9e5c9a73446de4778e21722c432e9"} Sep 30 05:41:02 crc kubenswrapper[4956]: I0930 05:41:02.519734 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-ntsz7" 
event={"ID":"0f07b182-685f-40c3-961e-eebfbf2d5fe5","Type":"ContainerStarted","Data":"33af405bf25b09b23bde6148e0307b7ed7fd7e9101e50d257e212b2221d71b7c"} Sep 30 05:41:02 crc kubenswrapper[4956]: I0930 05:41:02.519778 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-ntsz7" event={"ID":"0f07b182-685f-40c3-961e-eebfbf2d5fe5","Type":"ContainerStarted","Data":"dcea644357751de95af19e74ca1ca74462c724326860fbdf142f64d62efb225e"} Sep 30 05:41:02 crc kubenswrapper[4956]: I0930 05:41:02.519797 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-ntsz7" event={"ID":"0f07b182-685f-40c3-961e-eebfbf2d5fe5","Type":"ContainerStarted","Data":"3304cf89f7229ed085262f9d8860619a3d7cd969c9e8e805a29b3641b9973ca2"} Sep 30 05:41:02 crc kubenswrapper[4956]: I0930 05:41:02.519829 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:02 crc kubenswrapper[4956]: I0930 05:41:02.533968 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-ntsz7" podStartSLOduration=1.533942245 podStartE2EDuration="1.533942245s" podCreationTimestamp="2025-09-30 05:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:41:02.533328035 +0000 UTC m=+732.860448600" watchObservedRunningTime="2025-09-30 05:41:02.533942245 +0000 UTC m=+732.861062800" Sep 30 05:41:02 crc kubenswrapper[4956]: I0930 05:41:02.870385 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-memberlist\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:02 crc kubenswrapper[4956]: I0930 05:41:02.877141 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c59b762c-1f05-46c5-8d6d-2bf39a8592f0-memberlist\") pod \"speaker-5pvg9\" (UID: \"c59b762c-1f05-46c5-8d6d-2bf39a8592f0\") " pod="metallb-system/speaker-5pvg9" Sep 30 05:41:02 crc kubenswrapper[4956]: I0930 05:41:02.898818 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5pvg9" Sep 30 05:41:02 crc kubenswrapper[4956]: W0930 05:41:02.923317 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc59b762c_1f05_46c5_8d6d_2bf39a8592f0.slice/crio-427e2624591562526905c15b83e742f197edb1a12634cd4e982811c3327e2846 WatchSource:0}: Error finding container 427e2624591562526905c15b83e742f197edb1a12634cd4e982811c3327e2846: Status 404 returned error can't find the container with id 427e2624591562526905c15b83e742f197edb1a12634cd4e982811c3327e2846 Sep 30 05:41:03 crc kubenswrapper[4956]: I0930 05:41:03.531943 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5pvg9" event={"ID":"c59b762c-1f05-46c5-8d6d-2bf39a8592f0","Type":"ContainerStarted","Data":"0d79c6c5def59193e2551ab9e956e61286352b04df6c46035eab6c4c49df8cff"} Sep 30 05:41:03 crc kubenswrapper[4956]: I0930 05:41:03.532362 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5pvg9" event={"ID":"c59b762c-1f05-46c5-8d6d-2bf39a8592f0","Type":"ContainerStarted","Data":"230369a340c6a11ad6744aa2430dde6f6fe3c7fe2e6fbc3b53e812976badf227"} Sep 30 05:41:03 crc kubenswrapper[4956]: I0930 05:41:03.532373 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5pvg9" event={"ID":"c59b762c-1f05-46c5-8d6d-2bf39a8592f0","Type":"ContainerStarted","Data":"427e2624591562526905c15b83e742f197edb1a12634cd4e982811c3327e2846"} Sep 30 05:41:03 crc kubenswrapper[4956]: I0930 05:41:03.532945 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="metallb-system/speaker-5pvg9" Sep 30 05:41:03 crc kubenswrapper[4956]: I0930 05:41:03.556655 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5pvg9" podStartSLOduration=2.556639126 podStartE2EDuration="2.556639126s" podCreationTimestamp="2025-09-30 05:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:41:03.552353939 +0000 UTC m=+733.879474464" watchObservedRunningTime="2025-09-30 05:41:03.556639126 +0000 UTC m=+733.883759661" Sep 30 05:41:09 crc kubenswrapper[4956]: I0930 05:41:09.581383 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" event={"ID":"26860b1e-eab3-4a86-b87d-3c52529f70dd","Type":"ContainerStarted","Data":"263c5a3660504f4d2948c2f48416b66148a51302bb0cbd90705a2bfe87dfbb14"} Sep 30 05:41:09 crc kubenswrapper[4956]: I0930 05:41:09.582108 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" Sep 30 05:41:09 crc kubenswrapper[4956]: I0930 05:41:09.583339 4956 generic.go:334] "Generic (PLEG): container finished" podID="809472fb-9344-46be-81f4-808a1fcc16c6" containerID="906548ca359fd832fe428bb90d8629818bd454820c40e20b04fdf49bfe7e582e" exitCode=0 Sep 30 05:41:09 crc kubenswrapper[4956]: I0930 05:41:09.583367 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5mnk7" event={"ID":"809472fb-9344-46be-81f4-808a1fcc16c6","Type":"ContainerDied","Data":"906548ca359fd832fe428bb90d8629818bd454820c40e20b04fdf49bfe7e582e"} Sep 30 05:41:09 crc kubenswrapper[4956]: I0930 05:41:09.609020 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" podStartSLOduration=2.787877796 podStartE2EDuration="9.608990007s" podCreationTimestamp="2025-09-30 05:41:00 +0000 
UTC" firstStartedPulling="2025-09-30 05:41:02.112180445 +0000 UTC m=+732.439300970" lastFinishedPulling="2025-09-30 05:41:08.933292656 +0000 UTC m=+739.260413181" observedRunningTime="2025-09-30 05:41:09.601990482 +0000 UTC m=+739.929111007" watchObservedRunningTime="2025-09-30 05:41:09.608990007 +0000 UTC m=+739.936110552" Sep 30 05:41:10 crc kubenswrapper[4956]: I0930 05:41:10.591465 4956 generic.go:334] "Generic (PLEG): container finished" podID="809472fb-9344-46be-81f4-808a1fcc16c6" containerID="9fbac82e48e2b6dade3d88b892884338123cc945ba57fc176c699966c49c9b80" exitCode=0 Sep 30 05:41:10 crc kubenswrapper[4956]: I0930 05:41:10.591513 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5mnk7" event={"ID":"809472fb-9344-46be-81f4-808a1fcc16c6","Type":"ContainerDied","Data":"9fbac82e48e2b6dade3d88b892884338123cc945ba57fc176c699966c49c9b80"} Sep 30 05:41:11 crc kubenswrapper[4956]: I0930 05:41:11.418891 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-ntsz7" Sep 30 05:41:11 crc kubenswrapper[4956]: I0930 05:41:11.600598 4956 generic.go:334] "Generic (PLEG): container finished" podID="809472fb-9344-46be-81f4-808a1fcc16c6" containerID="016e17decdcd6a5561a2ed2507761837fb43ffc1bfd9846e449f520120ebfefd" exitCode=0 Sep 30 05:41:11 crc kubenswrapper[4956]: I0930 05:41:11.600731 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5mnk7" event={"ID":"809472fb-9344-46be-81f4-808a1fcc16c6","Type":"ContainerDied","Data":"016e17decdcd6a5561a2ed2507761837fb43ffc1bfd9846e449f520120ebfefd"} Sep 30 05:41:12 crc kubenswrapper[4956]: I0930 05:41:12.615856 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5mnk7" event={"ID":"809472fb-9344-46be-81f4-808a1fcc16c6","Type":"ContainerStarted","Data":"31ee7943f423868db8378316370cf8f5fc2397f5f815d1b2f7b9fc0244f67671"} Sep 30 05:41:12 crc kubenswrapper[4956]: I0930 05:41:12.615900 
4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5mnk7" event={"ID":"809472fb-9344-46be-81f4-808a1fcc16c6","Type":"ContainerStarted","Data":"16c1321de7db6fcc7957ba68b73f2ebe4667c874e8628ae9f04fa67b33924888"} Sep 30 05:41:12 crc kubenswrapper[4956]: I0930 05:41:12.615916 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5mnk7" event={"ID":"809472fb-9344-46be-81f4-808a1fcc16c6","Type":"ContainerStarted","Data":"040c7be1094bd5f1f32a906e924228ec60272e6745eeabbd4e4805bfaf66f324"} Sep 30 05:41:12 crc kubenswrapper[4956]: I0930 05:41:12.615928 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5mnk7" event={"ID":"809472fb-9344-46be-81f4-808a1fcc16c6","Type":"ContainerStarted","Data":"cc9a1caf893c2ad6a4e269108e46df2684fdb4c5e91e26afe3d524fe4e5bfb03"} Sep 30 05:41:12 crc kubenswrapper[4956]: I0930 05:41:12.615940 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5mnk7" event={"ID":"809472fb-9344-46be-81f4-808a1fcc16c6","Type":"ContainerStarted","Data":"a834e7230ee90434cb61c49e9de032c5699df1e3a0862d8687098705075b03c5"} Sep 30 05:41:13 crc kubenswrapper[4956]: I0930 05:41:13.623470 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5mnk7" event={"ID":"809472fb-9344-46be-81f4-808a1fcc16c6","Type":"ContainerStarted","Data":"6d036cf2d9ceabf577223960106e1ee328cbb5d2bdd69e9cb3c95a0f3c3af465"} Sep 30 05:41:13 crc kubenswrapper[4956]: I0930 05:41:13.623782 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:13 crc kubenswrapper[4956]: I0930 05:41:13.670191 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5mnk7" podStartSLOduration=6.443136422 podStartE2EDuration="13.670082082s" podCreationTimestamp="2025-09-30 05:41:00 +0000 UTC" firstStartedPulling="2025-09-30 05:41:01.714033532 +0000 UTC m=+732.041154047" 
lastFinishedPulling="2025-09-30 05:41:08.940979182 +0000 UTC m=+739.268099707" observedRunningTime="2025-09-30 05:41:13.655462812 +0000 UTC m=+743.982583367" watchObservedRunningTime="2025-09-30 05:41:13.670082082 +0000 UTC m=+743.997202617" Sep 30 05:41:16 crc kubenswrapper[4956]: I0930 05:41:16.582305 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:16 crc kubenswrapper[4956]: I0930 05:41:16.647096 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:18 crc kubenswrapper[4956]: I0930 05:41:18.073434 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:41:18 crc kubenswrapper[4956]: I0930 05:41:18.073497 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:41:18 crc kubenswrapper[4956]: I0930 05:41:18.073543 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:41:18 crc kubenswrapper[4956]: I0930 05:41:18.074183 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb7b5906427c53280d2cfba1bc232da8c4f2d2136d0c44bbfff91603d668f7d8"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 05:41:18 crc 
kubenswrapper[4956]: I0930 05:41:18.074250 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://cb7b5906427c53280d2cfba1bc232da8c4f2d2136d0c44bbfff91603d668f7d8" gracePeriod=600 Sep 30 05:41:18 crc kubenswrapper[4956]: I0930 05:41:18.668010 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="cb7b5906427c53280d2cfba1bc232da8c4f2d2136d0c44bbfff91603d668f7d8" exitCode=0 Sep 30 05:41:18 crc kubenswrapper[4956]: I0930 05:41:18.668396 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"cb7b5906427c53280d2cfba1bc232da8c4f2d2136d0c44bbfff91603d668f7d8"} Sep 30 05:41:18 crc kubenswrapper[4956]: I0930 05:41:18.668439 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"357cd0f449d0c4885ee791783ca31e326eaaa7e5f1a04d708b2435c26c26f499"} Sep 30 05:41:18 crc kubenswrapper[4956]: I0930 05:41:18.668457 4956 scope.go:117] "RemoveContainer" containerID="bf32bc96b878134a19be7c69abbea20183ca590cbab32abb08f319a6b44d2808" Sep 30 05:41:21 crc kubenswrapper[4956]: I0930 05:41:21.585770 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5mnk7" Sep 30 05:41:21 crc kubenswrapper[4956]: I0930 05:41:21.903365 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-7r8bt" Sep 30 05:41:22 crc kubenswrapper[4956]: I0930 05:41:22.902346 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/speaker-5pvg9" Sep 30 05:41:23 crc kubenswrapper[4956]: I0930 05:41:23.797882 4956 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 05:41:25 crc kubenswrapper[4956]: I0930 05:41:25.754533 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cp4pt"] Sep 30 05:41:25 crc kubenswrapper[4956]: I0930 05:41:25.756068 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cp4pt" Sep 30 05:41:25 crc kubenswrapper[4956]: I0930 05:41:25.761728 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 30 05:41:25 crc kubenswrapper[4956]: I0930 05:41:25.761926 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-hr4hz" Sep 30 05:41:25 crc kubenswrapper[4956]: I0930 05:41:25.762078 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 30 05:41:25 crc kubenswrapper[4956]: I0930 05:41:25.766036 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cp4pt"] Sep 30 05:41:25 crc kubenswrapper[4956]: I0930 05:41:25.904196 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4w8j\" (UniqueName: \"kubernetes.io/projected/ced23828-32fc-4f97-8b72-26889c78574e-kube-api-access-q4w8j\") pod \"openstack-operator-index-cp4pt\" (UID: \"ced23828-32fc-4f97-8b72-26889c78574e\") " pod="openstack-operators/openstack-operator-index-cp4pt" Sep 30 05:41:26 crc kubenswrapper[4956]: I0930 05:41:26.005834 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4w8j\" (UniqueName: 
\"kubernetes.io/projected/ced23828-32fc-4f97-8b72-26889c78574e-kube-api-access-q4w8j\") pod \"openstack-operator-index-cp4pt\" (UID: \"ced23828-32fc-4f97-8b72-26889c78574e\") " pod="openstack-operators/openstack-operator-index-cp4pt" Sep 30 05:41:26 crc kubenswrapper[4956]: I0930 05:41:26.036380 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4w8j\" (UniqueName: \"kubernetes.io/projected/ced23828-32fc-4f97-8b72-26889c78574e-kube-api-access-q4w8j\") pod \"openstack-operator-index-cp4pt\" (UID: \"ced23828-32fc-4f97-8b72-26889c78574e\") " pod="openstack-operators/openstack-operator-index-cp4pt" Sep 30 05:41:26 crc kubenswrapper[4956]: I0930 05:41:26.083768 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cp4pt" Sep 30 05:41:26 crc kubenswrapper[4956]: I0930 05:41:26.496050 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cp4pt"] Sep 30 05:41:26 crc kubenswrapper[4956]: W0930 05:41:26.511327 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podced23828_32fc_4f97_8b72_26889c78574e.slice/crio-6f3e823b13b38239136a0c4e84f1f55eec8b86a509453c0bc90a64bc475f8cf6 WatchSource:0}: Error finding container 6f3e823b13b38239136a0c4e84f1f55eec8b86a509453c0bc90a64bc475f8cf6: Status 404 returned error can't find the container with id 6f3e823b13b38239136a0c4e84f1f55eec8b86a509453c0bc90a64bc475f8cf6 Sep 30 05:41:26 crc kubenswrapper[4956]: I0930 05:41:26.729439 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cp4pt" event={"ID":"ced23828-32fc-4f97-8b72-26889c78574e","Type":"ContainerStarted","Data":"6f3e823b13b38239136a0c4e84f1f55eec8b86a509453c0bc90a64bc475f8cf6"} Sep 30 05:41:27 crc kubenswrapper[4956]: I0930 05:41:27.746043 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-cp4pt" event={"ID":"ced23828-32fc-4f97-8b72-26889c78574e","Type":"ContainerStarted","Data":"5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf"} Sep 30 05:41:27 crc kubenswrapper[4956]: I0930 05:41:27.767702 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cp4pt" podStartSLOduration=1.939759024 podStartE2EDuration="2.767679722s" podCreationTimestamp="2025-09-30 05:41:25 +0000 UTC" firstStartedPulling="2025-09-30 05:41:26.513180259 +0000 UTC m=+756.840300784" lastFinishedPulling="2025-09-30 05:41:27.341100957 +0000 UTC m=+757.668221482" observedRunningTime="2025-09-30 05:41:27.763653203 +0000 UTC m=+758.090773738" watchObservedRunningTime="2025-09-30 05:41:27.767679722 +0000 UTC m=+758.094800257" Sep 30 05:41:29 crc kubenswrapper[4956]: I0930 05:41:29.946498 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-cp4pt"] Sep 30 05:41:29 crc kubenswrapper[4956]: I0930 05:41:29.947512 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-cp4pt" podUID="ced23828-32fc-4f97-8b72-26889c78574e" containerName="registry-server" containerID="cri-o://5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf" gracePeriod=2 Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.366428 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cp4pt" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.466445 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4w8j\" (UniqueName: \"kubernetes.io/projected/ced23828-32fc-4f97-8b72-26889c78574e-kube-api-access-q4w8j\") pod \"ced23828-32fc-4f97-8b72-26889c78574e\" (UID: \"ced23828-32fc-4f97-8b72-26889c78574e\") " Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.476776 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced23828-32fc-4f97-8b72-26889c78574e-kube-api-access-q4w8j" (OuterVolumeSpecName: "kube-api-access-q4w8j") pod "ced23828-32fc-4f97-8b72-26889c78574e" (UID: "ced23828-32fc-4f97-8b72-26889c78574e"). InnerVolumeSpecName "kube-api-access-q4w8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.553660 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-4bg6n"] Sep 30 05:41:30 crc kubenswrapper[4956]: E0930 05:41:30.553952 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced23828-32fc-4f97-8b72-26889c78574e" containerName="registry-server" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.553972 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced23828-32fc-4f97-8b72-26889c78574e" containerName="registry-server" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.554220 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced23828-32fc-4f97-8b72-26889c78574e" containerName="registry-server" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.554720 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4bg6n" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.563201 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4bg6n"] Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.567789 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4w8j\" (UniqueName: \"kubernetes.io/projected/ced23828-32fc-4f97-8b72-26889c78574e-kube-api-access-q4w8j\") on node \"crc\" DevicePath \"\"" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.668486 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lbcr\" (UniqueName: \"kubernetes.io/projected/dbd8b6e7-61f4-4a07-92ab-ed42a432df93-kube-api-access-4lbcr\") pod \"openstack-operator-index-4bg6n\" (UID: \"dbd8b6e7-61f4-4a07-92ab-ed42a432df93\") " pod="openstack-operators/openstack-operator-index-4bg6n" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.765143 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cp4pt" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.765223 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cp4pt" event={"ID":"ced23828-32fc-4f97-8b72-26889c78574e","Type":"ContainerDied","Data":"5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf"} Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.765696 4956 scope.go:117] "RemoveContainer" containerID="5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.769298 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lbcr\" (UniqueName: \"kubernetes.io/projected/dbd8b6e7-61f4-4a07-92ab-ed42a432df93-kube-api-access-4lbcr\") pod \"openstack-operator-index-4bg6n\" (UID: \"dbd8b6e7-61f4-4a07-92ab-ed42a432df93\") " pod="openstack-operators/openstack-operator-index-4bg6n" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.778521 4956 generic.go:334] "Generic (PLEG): container finished" podID="ced23828-32fc-4f97-8b72-26889c78574e" containerID="5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf" exitCode=0 Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.778753 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cp4pt" event={"ID":"ced23828-32fc-4f97-8b72-26889c78574e","Type":"ContainerDied","Data":"6f3e823b13b38239136a0c4e84f1f55eec8b86a509453c0bc90a64bc475f8cf6"} Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.784439 4956 scope.go:117] "RemoveContainer" containerID="5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf" Sep 30 05:41:30 crc kubenswrapper[4956]: E0930 05:41:30.786341 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf\": container with ID starting with 5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf not found: ID does not exist" containerID="5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.786490 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf"} err="failed to get container status \"5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf\": rpc error: code = NotFound desc = could not find container \"5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf\": container with ID starting with 5a72ac6774a92194db928424423fba9ec8002e8e0f0a5ba416eeca0474a73cdf not found: ID does not exist" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.796306 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lbcr\" (UniqueName: \"kubernetes.io/projected/dbd8b6e7-61f4-4a07-92ab-ed42a432df93-kube-api-access-4lbcr\") pod \"openstack-operator-index-4bg6n\" (UID: \"dbd8b6e7-61f4-4a07-92ab-ed42a432df93\") " pod="openstack-operators/openstack-operator-index-4bg6n" Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.804860 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-cp4pt"] Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.807877 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-cp4pt"] Sep 30 05:41:30 crc kubenswrapper[4956]: I0930 05:41:30.875471 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4bg6n" Sep 30 05:41:31 crc kubenswrapper[4956]: I0930 05:41:31.114818 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4bg6n"] Sep 30 05:41:31 crc kubenswrapper[4956]: I0930 05:41:31.787091 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4bg6n" event={"ID":"dbd8b6e7-61f4-4a07-92ab-ed42a432df93","Type":"ContainerStarted","Data":"b4f86cc7af802e2fd473c4a84050e8e8c6c0ab43c8ef50fcbb1fc08cf449afd2"} Sep 30 05:41:32 crc kubenswrapper[4956]: I0930 05:41:32.351224 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ced23828-32fc-4f97-8b72-26889c78574e" path="/var/lib/kubelet/pods/ced23828-32fc-4f97-8b72-26889c78574e/volumes" Sep 30 05:41:32 crc kubenswrapper[4956]: I0930 05:41:32.800433 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4bg6n" event={"ID":"dbd8b6e7-61f4-4a07-92ab-ed42a432df93","Type":"ContainerStarted","Data":"04955667d96d3dd1565d38a92acf1973523f600c7b338af77199394908bd265a"} Sep 30 05:41:32 crc kubenswrapper[4956]: I0930 05:41:32.824801 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-4bg6n" podStartSLOduration=2.10800467 podStartE2EDuration="2.82476803s" podCreationTimestamp="2025-09-30 05:41:30 +0000 UTC" firstStartedPulling="2025-09-30 05:41:31.124227718 +0000 UTC m=+761.451348243" lastFinishedPulling="2025-09-30 05:41:31.840991078 +0000 UTC m=+762.168111603" observedRunningTime="2025-09-30 05:41:32.816763433 +0000 UTC m=+763.143884018" watchObservedRunningTime="2025-09-30 05:41:32.82476803 +0000 UTC m=+763.151888585" Sep 30 05:41:40 crc kubenswrapper[4956]: I0930 05:41:40.876564 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-4bg6n" Sep 30 05:41:40 
crc kubenswrapper[4956]: I0930 05:41:40.877051 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-4bg6n" Sep 30 05:41:40 crc kubenswrapper[4956]: I0930 05:41:40.913649 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-4bg6n" Sep 30 05:41:41 crc kubenswrapper[4956]: I0930 05:41:41.894433 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-4bg6n" Sep 30 05:41:48 crc kubenswrapper[4956]: I0930 05:41:48.901459 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn"] Sep 30 05:41:48 crc kubenswrapper[4956]: I0930 05:41:48.903096 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:48 crc kubenswrapper[4956]: I0930 05:41:48.949424 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-jnn7t" Sep 30 05:41:48 crc kubenswrapper[4956]: I0930 05:41:48.971005 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn"] Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.050876 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d72bb\" (UniqueName: \"kubernetes.io/projected/7b070be8-a2e3-4b53-bfa6-502bc50daf72-kube-api-access-d72bb\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn\" (UID: \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.050981 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b070be8-a2e3-4b53-bfa6-502bc50daf72-util\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn\" (UID: \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.051526 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b070be8-a2e3-4b53-bfa6-502bc50daf72-bundle\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn\" (UID: \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.152341 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d72bb\" (UniqueName: \"kubernetes.io/projected/7b070be8-a2e3-4b53-bfa6-502bc50daf72-kube-api-access-d72bb\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn\" (UID: \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.152405 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b070be8-a2e3-4b53-bfa6-502bc50daf72-util\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn\" (UID: \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.152444 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7b070be8-a2e3-4b53-bfa6-502bc50daf72-bundle\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn\" (UID: \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.152847 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b070be8-a2e3-4b53-bfa6-502bc50daf72-bundle\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn\" (UID: \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.152891 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b070be8-a2e3-4b53-bfa6-502bc50daf72-util\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn\" (UID: \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.171407 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d72bb\" (UniqueName: \"kubernetes.io/projected/7b070be8-a2e3-4b53-bfa6-502bc50daf72-kube-api-access-d72bb\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn\" (UID: \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.269755 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.463837 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn"] Sep 30 05:41:49 crc kubenswrapper[4956]: W0930 05:41:49.479132 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b070be8_a2e3_4b53_bfa6_502bc50daf72.slice/crio-d9b94e9c9de4e2396c8fd68afce10a833998d90e77e6323e49a4e06508e53c44 WatchSource:0}: Error finding container d9b94e9c9de4e2396c8fd68afce10a833998d90e77e6323e49a4e06508e53c44: Status 404 returned error can't find the container with id d9b94e9c9de4e2396c8fd68afce10a833998d90e77e6323e49a4e06508e53c44 Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.960100 4956 generic.go:334] "Generic (PLEG): container finished" podID="7b070be8-a2e3-4b53-bfa6-502bc50daf72" containerID="c837031758fbc34527e4b90252a3d0b584fb293a510c1a7359eb16d003821038" exitCode=0 Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.960177 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" event={"ID":"7b070be8-a2e3-4b53-bfa6-502bc50daf72","Type":"ContainerDied","Data":"c837031758fbc34527e4b90252a3d0b584fb293a510c1a7359eb16d003821038"} Sep 30 05:41:49 crc kubenswrapper[4956]: I0930 05:41:49.960213 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" event={"ID":"7b070be8-a2e3-4b53-bfa6-502bc50daf72","Type":"ContainerStarted","Data":"d9b94e9c9de4e2396c8fd68afce10a833998d90e77e6323e49a4e06508e53c44"} Sep 30 05:41:50 crc kubenswrapper[4956]: I0930 05:41:50.967402 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="7b070be8-a2e3-4b53-bfa6-502bc50daf72" containerID="ad012e5c24b7f477d1d9169a6e5d15f53e9d2e82575640b449eaf5e0f163db0f" exitCode=0 Sep 30 05:41:50 crc kubenswrapper[4956]: I0930 05:41:50.967520 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" event={"ID":"7b070be8-a2e3-4b53-bfa6-502bc50daf72","Type":"ContainerDied","Data":"ad012e5c24b7f477d1d9169a6e5d15f53e9d2e82575640b449eaf5e0f163db0f"} Sep 30 05:41:51 crc kubenswrapper[4956]: I0930 05:41:51.976275 4956 generic.go:334] "Generic (PLEG): container finished" podID="7b070be8-a2e3-4b53-bfa6-502bc50daf72" containerID="bb7f597d1ae18654d141550ed64f930db9afe958b1edc014754a480e7b1b316c" exitCode=0 Sep 30 05:41:51 crc kubenswrapper[4956]: I0930 05:41:51.976377 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" event={"ID":"7b070be8-a2e3-4b53-bfa6-502bc50daf72","Type":"ContainerDied","Data":"bb7f597d1ae18654d141550ed64f930db9afe958b1edc014754a480e7b1b316c"} Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.625654 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nsq7j"] Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.628411 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.643252 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nsq7j"] Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.714950 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e97b25d9-fabb-4c79-a586-c0fa9c73ee2d-utilities\") pod \"community-operators-nsq7j\" (UID: \"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d\") " pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.714992 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwlv6\" (UniqueName: \"kubernetes.io/projected/e97b25d9-fabb-4c79-a586-c0fa9c73ee2d-kube-api-access-hwlv6\") pod \"community-operators-nsq7j\" (UID: \"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d\") " pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.715039 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97b25d9-fabb-4c79-a586-c0fa9c73ee2d-catalog-content\") pod \"community-operators-nsq7j\" (UID: \"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d\") " pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.816128 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97b25d9-fabb-4c79-a586-c0fa9c73ee2d-catalog-content\") pod \"community-operators-nsq7j\" (UID: \"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d\") " pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.816220 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e97b25d9-fabb-4c79-a586-c0fa9c73ee2d-utilities\") pod \"community-operators-nsq7j\" (UID: \"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d\") " pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.816244 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwlv6\" (UniqueName: \"kubernetes.io/projected/e97b25d9-fabb-4c79-a586-c0fa9c73ee2d-kube-api-access-hwlv6\") pod \"community-operators-nsq7j\" (UID: \"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d\") " pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.816679 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e97b25d9-fabb-4c79-a586-c0fa9c73ee2d-utilities\") pod \"community-operators-nsq7j\" (UID: \"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d\") " pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.816755 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97b25d9-fabb-4c79-a586-c0fa9c73ee2d-catalog-content\") pod \"community-operators-nsq7j\" (UID: \"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d\") " pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.835752 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwlv6\" (UniqueName: \"kubernetes.io/projected/e97b25d9-fabb-4c79-a586-c0fa9c73ee2d-kube-api-access-hwlv6\") pod \"community-operators-nsq7j\" (UID: \"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d\") " pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:41:52 crc kubenswrapper[4956]: I0930 05:41:52.952391 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.270348 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.410941 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nsq7j"] Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.423410 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b070be8-a2e3-4b53-bfa6-502bc50daf72-bundle\") pod \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\" (UID: \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\") " Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.423547 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d72bb\" (UniqueName: \"kubernetes.io/projected/7b070be8-a2e3-4b53-bfa6-502bc50daf72-kube-api-access-d72bb\") pod \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\" (UID: \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\") " Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.423621 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b070be8-a2e3-4b53-bfa6-502bc50daf72-util\") pod \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\" (UID: \"7b070be8-a2e3-4b53-bfa6-502bc50daf72\") " Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.424809 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b070be8-a2e3-4b53-bfa6-502bc50daf72-bundle" (OuterVolumeSpecName: "bundle") pod "7b070be8-a2e3-4b53-bfa6-502bc50daf72" (UID: "7b070be8-a2e3-4b53-bfa6-502bc50daf72"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.428669 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b070be8-a2e3-4b53-bfa6-502bc50daf72-kube-api-access-d72bb" (OuterVolumeSpecName: "kube-api-access-d72bb") pod "7b070be8-a2e3-4b53-bfa6-502bc50daf72" (UID: "7b070be8-a2e3-4b53-bfa6-502bc50daf72"). InnerVolumeSpecName "kube-api-access-d72bb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.437637 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b070be8-a2e3-4b53-bfa6-502bc50daf72-util" (OuterVolumeSpecName: "util") pod "7b070be8-a2e3-4b53-bfa6-502bc50daf72" (UID: "7b070be8-a2e3-4b53-bfa6-502bc50daf72"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.524878 4956 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b070be8-a2e3-4b53-bfa6-502bc50daf72-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.524911 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d72bb\" (UniqueName: \"kubernetes.io/projected/7b070be8-a2e3-4b53-bfa6-502bc50daf72-kube-api-access-d72bb\") on node \"crc\" DevicePath \"\"" Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.524920 4956 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b070be8-a2e3-4b53-bfa6-502bc50daf72-util\") on node \"crc\" DevicePath \"\"" Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.989336 4956 generic.go:334] "Generic (PLEG): container finished" podID="e97b25d9-fabb-4c79-a586-c0fa9c73ee2d" containerID="5cc67f440d129d43af20965b0f735c1a39c00f803562aac7be8450f2474e12b1" exitCode=0 Sep 30 05:41:53 crc 
kubenswrapper[4956]: I0930 05:41:53.989519 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nsq7j" event={"ID":"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d","Type":"ContainerDied","Data":"5cc67f440d129d43af20965b0f735c1a39c00f803562aac7be8450f2474e12b1"} Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.989668 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nsq7j" event={"ID":"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d","Type":"ContainerStarted","Data":"b1c81d019566d060cbdde5fb3b123beebb4a189a6f01a968d2695e073cf9f610"} Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.992487 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" event={"ID":"7b070be8-a2e3-4b53-bfa6-502bc50daf72","Type":"ContainerDied","Data":"d9b94e9c9de4e2396c8fd68afce10a833998d90e77e6323e49a4e06508e53c44"} Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.992530 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9b94e9c9de4e2396c8fd68afce10a833998d90e77e6323e49a4e06508e53c44" Sep 30 05:41:53 crc kubenswrapper[4956]: I0930 05:41:53.992565 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn" Sep 30 05:41:57 crc kubenswrapper[4956]: I0930 05:41:57.000491 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf"] Sep 30 05:41:57 crc kubenswrapper[4956]: E0930 05:41:57.000703 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b070be8-a2e3-4b53-bfa6-502bc50daf72" containerName="pull" Sep 30 05:41:57 crc kubenswrapper[4956]: I0930 05:41:57.000714 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b070be8-a2e3-4b53-bfa6-502bc50daf72" containerName="pull" Sep 30 05:41:57 crc kubenswrapper[4956]: E0930 05:41:57.000725 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b070be8-a2e3-4b53-bfa6-502bc50daf72" containerName="util" Sep 30 05:41:57 crc kubenswrapper[4956]: I0930 05:41:57.000730 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b070be8-a2e3-4b53-bfa6-502bc50daf72" containerName="util" Sep 30 05:41:57 crc kubenswrapper[4956]: E0930 05:41:57.000740 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b070be8-a2e3-4b53-bfa6-502bc50daf72" containerName="extract" Sep 30 05:41:57 crc kubenswrapper[4956]: I0930 05:41:57.000746 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b070be8-a2e3-4b53-bfa6-502bc50daf72" containerName="extract" Sep 30 05:41:57 crc kubenswrapper[4956]: I0930 05:41:57.000865 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b070be8-a2e3-4b53-bfa6-502bc50daf72" containerName="extract" Sep 30 05:41:57 crc kubenswrapper[4956]: I0930 05:41:57.001441 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf" Sep 30 05:41:57 crc kubenswrapper[4956]: I0930 05:41:57.005465 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-zqhcg" Sep 30 05:41:57 crc kubenswrapper[4956]: I0930 05:41:57.032830 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf"] Sep 30 05:41:57 crc kubenswrapper[4956]: I0930 05:41:57.182497 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj82z\" (UniqueName: \"kubernetes.io/projected/3a68c8b3-216d-4ba4-b841-054d52526caf-kube-api-access-tj82z\") pod \"openstack-operator-controller-operator-56dc567787-c5lxf\" (UID: \"3a68c8b3-216d-4ba4-b841-054d52526caf\") " pod="openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf" Sep 30 05:41:57 crc kubenswrapper[4956]: I0930 05:41:57.283917 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj82z\" (UniqueName: \"kubernetes.io/projected/3a68c8b3-216d-4ba4-b841-054d52526caf-kube-api-access-tj82z\") pod \"openstack-operator-controller-operator-56dc567787-c5lxf\" (UID: \"3a68c8b3-216d-4ba4-b841-054d52526caf\") " pod="openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf" Sep 30 05:41:57 crc kubenswrapper[4956]: I0930 05:41:57.308293 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj82z\" (UniqueName: \"kubernetes.io/projected/3a68c8b3-216d-4ba4-b841-054d52526caf-kube-api-access-tj82z\") pod \"openstack-operator-controller-operator-56dc567787-c5lxf\" (UID: \"3a68c8b3-216d-4ba4-b841-054d52526caf\") " pod="openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf" Sep 30 05:41:57 crc kubenswrapper[4956]: I0930 05:41:57.347864 4956 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf" Sep 30 05:41:58 crc kubenswrapper[4956]: I0930 05:41:58.015752 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nsq7j" event={"ID":"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d","Type":"ContainerStarted","Data":"26b15380b716b33061549dbba77deb0a8f1bc496c4e44084a06093cbb24224d5"} Sep 30 05:41:58 crc kubenswrapper[4956]: I0930 05:41:58.058522 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf"] Sep 30 05:41:58 crc kubenswrapper[4956]: W0930 05:41:58.060982 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a68c8b3_216d_4ba4_b841_054d52526caf.slice/crio-999a9520021eab39f42b4453d3cd5cb62f1aa865009d1af03beb5e8aaecf7f6a WatchSource:0}: Error finding container 999a9520021eab39f42b4453d3cd5cb62f1aa865009d1af03beb5e8aaecf7f6a: Status 404 returned error can't find the container with id 999a9520021eab39f42b4453d3cd5cb62f1aa865009d1af03beb5e8aaecf7f6a Sep 30 05:41:59 crc kubenswrapper[4956]: I0930 05:41:59.026223 4956 generic.go:334] "Generic (PLEG): container finished" podID="e97b25d9-fabb-4c79-a586-c0fa9c73ee2d" containerID="26b15380b716b33061549dbba77deb0a8f1bc496c4e44084a06093cbb24224d5" exitCode=0 Sep 30 05:41:59 crc kubenswrapper[4956]: I0930 05:41:59.026326 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nsq7j" event={"ID":"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d","Type":"ContainerDied","Data":"26b15380b716b33061549dbba77deb0a8f1bc496c4e44084a06093cbb24224d5"} Sep 30 05:41:59 crc kubenswrapper[4956]: I0930 05:41:59.028370 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf" 
event={"ID":"3a68c8b3-216d-4ba4-b841-054d52526caf","Type":"ContainerStarted","Data":"999a9520021eab39f42b4453d3cd5cb62f1aa865009d1af03beb5e8aaecf7f6a"} Sep 30 05:42:00 crc kubenswrapper[4956]: I0930 05:42:00.039224 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nsq7j" event={"ID":"e97b25d9-fabb-4c79-a586-c0fa9c73ee2d","Type":"ContainerStarted","Data":"c370bb96b9a172b90b5c6c41cccceaccd39e6beefffd722f61949c52bb037163"} Sep 30 05:42:00 crc kubenswrapper[4956]: I0930 05:42:00.057779 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nsq7j" podStartSLOduration=2.665858494 podStartE2EDuration="8.057762515s" podCreationTimestamp="2025-09-30 05:41:52 +0000 UTC" firstStartedPulling="2025-09-30 05:41:53.991231068 +0000 UTC m=+784.318351603" lastFinishedPulling="2025-09-30 05:41:59.383135099 +0000 UTC m=+789.710255624" observedRunningTime="2025-09-30 05:42:00.05447865 +0000 UTC m=+790.381599175" watchObservedRunningTime="2025-09-30 05:42:00.057762515 +0000 UTC m=+790.384883040" Sep 30 05:42:02 crc kubenswrapper[4956]: I0930 05:42:02.953505 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:42:02 crc kubenswrapper[4956]: I0930 05:42:02.953757 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:42:02 crc kubenswrapper[4956]: I0930 05:42:02.997457 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:42:04 crc kubenswrapper[4956]: I0930 05:42:04.062406 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf" 
event={"ID":"3a68c8b3-216d-4ba4-b841-054d52526caf","Type":"ContainerStarted","Data":"d67ef576fc5e353d96edfc344ab0769c312bdc5d2449514adbe187eae349a6f8"} Sep 30 05:42:06 crc kubenswrapper[4956]: I0930 05:42:06.082622 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf" event={"ID":"3a68c8b3-216d-4ba4-b841-054d52526caf","Type":"ContainerStarted","Data":"ce51affaadfa61e357e0d5cace51f23bd19b25ef3a4c0006af3ebf49777d303e"} Sep 30 05:42:06 crc kubenswrapper[4956]: I0930 05:42:06.083672 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf" Sep 30 05:42:06 crc kubenswrapper[4956]: I0930 05:42:06.113900 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf" podStartSLOduration=2.557966592 podStartE2EDuration="10.113884389s" podCreationTimestamp="2025-09-30 05:41:56 +0000 UTC" firstStartedPulling="2025-09-30 05:41:58.063129633 +0000 UTC m=+788.390250158" lastFinishedPulling="2025-09-30 05:42:05.61904743 +0000 UTC m=+795.946167955" observedRunningTime="2025-09-30 05:42:06.113837567 +0000 UTC m=+796.440958162" watchObservedRunningTime="2025-09-30 05:42:06.113884389 +0000 UTC m=+796.441004914" Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.416569 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k4wfk"] Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.417995 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4wfk" Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.465200 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4wfk"] Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.490287 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff360ad-893d-4f2a-9d65-3b79c965edbd-catalog-content\") pod \"certified-operators-k4wfk\" (UID: \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\") " pod="openshift-marketplace/certified-operators-k4wfk" Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.490364 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vskf9\" (UniqueName: \"kubernetes.io/projected/4ff360ad-893d-4f2a-9d65-3b79c965edbd-kube-api-access-vskf9\") pod \"certified-operators-k4wfk\" (UID: \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\") " pod="openshift-marketplace/certified-operators-k4wfk" Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.490574 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff360ad-893d-4f2a-9d65-3b79c965edbd-utilities\") pod \"certified-operators-k4wfk\" (UID: \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\") " pod="openshift-marketplace/certified-operators-k4wfk" Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.591964 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff360ad-893d-4f2a-9d65-3b79c965edbd-utilities\") pod \"certified-operators-k4wfk\" (UID: \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\") " pod="openshift-marketplace/certified-operators-k4wfk" Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.592047 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff360ad-893d-4f2a-9d65-3b79c965edbd-catalog-content\") pod \"certified-operators-k4wfk\" (UID: \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\") " pod="openshift-marketplace/certified-operators-k4wfk" Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.592086 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vskf9\" (UniqueName: \"kubernetes.io/projected/4ff360ad-893d-4f2a-9d65-3b79c965edbd-kube-api-access-vskf9\") pod \"certified-operators-k4wfk\" (UID: \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\") " pod="openshift-marketplace/certified-operators-k4wfk" Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.592776 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff360ad-893d-4f2a-9d65-3b79c965edbd-utilities\") pod \"certified-operators-k4wfk\" (UID: \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\") " pod="openshift-marketplace/certified-operators-k4wfk" Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.592992 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff360ad-893d-4f2a-9d65-3b79c965edbd-catalog-content\") pod \"certified-operators-k4wfk\" (UID: \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\") " pod="openshift-marketplace/certified-operators-k4wfk" Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.630589 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vskf9\" (UniqueName: \"kubernetes.io/projected/4ff360ad-893d-4f2a-9d65-3b79c965edbd-kube-api-access-vskf9\") pod \"certified-operators-k4wfk\" (UID: \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\") " pod="openshift-marketplace/certified-operators-k4wfk" Sep 30 05:42:07 crc kubenswrapper[4956]: I0930 05:42:07.763047 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4wfk" Sep 30 05:42:08 crc kubenswrapper[4956]: I0930 05:42:08.097690 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-c5lxf" Sep 30 05:42:08 crc kubenswrapper[4956]: I0930 05:42:08.228669 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4wfk"] Sep 30 05:42:09 crc kubenswrapper[4956]: I0930 05:42:09.102423 4956 generic.go:334] "Generic (PLEG): container finished" podID="4ff360ad-893d-4f2a-9d65-3b79c965edbd" containerID="8941b95c816f2ac1bee60f18e13c69e1c56824e54db0628d39893fb7cc632562" exitCode=0 Sep 30 05:42:09 crc kubenswrapper[4956]: I0930 05:42:09.102522 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4wfk" event={"ID":"4ff360ad-893d-4f2a-9d65-3b79c965edbd","Type":"ContainerDied","Data":"8941b95c816f2ac1bee60f18e13c69e1c56824e54db0628d39893fb7cc632562"} Sep 30 05:42:09 crc kubenswrapper[4956]: I0930 05:42:09.102807 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4wfk" event={"ID":"4ff360ad-893d-4f2a-9d65-3b79c965edbd","Type":"ContainerStarted","Data":"deb44183df6cfd20d49dd5745630938c28d595bb60303ee8872ea75e41213dab"} Sep 30 05:42:10 crc kubenswrapper[4956]: I0930 05:42:10.110908 4956 generic.go:334] "Generic (PLEG): container finished" podID="4ff360ad-893d-4f2a-9d65-3b79c965edbd" containerID="dd1cbd1b5ab5279443025656ac8bcf9387f2e08057f7ca5da044d00a92cd13a8" exitCode=0 Sep 30 05:42:10 crc kubenswrapper[4956]: I0930 05:42:10.110947 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4wfk" event={"ID":"4ff360ad-893d-4f2a-9d65-3b79c965edbd","Type":"ContainerDied","Data":"dd1cbd1b5ab5279443025656ac8bcf9387f2e08057f7ca5da044d00a92cd13a8"} Sep 30 05:42:11 crc kubenswrapper[4956]: I0930 
05:42:11.117796 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4wfk" event={"ID":"4ff360ad-893d-4f2a-9d65-3b79c965edbd","Type":"ContainerStarted","Data":"6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64"} Sep 30 05:42:11 crc kubenswrapper[4956]: I0930 05:42:11.135843 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k4wfk" podStartSLOduration=2.750197288 podStartE2EDuration="4.135826189s" podCreationTimestamp="2025-09-30 05:42:07 +0000 UTC" firstStartedPulling="2025-09-30 05:42:09.104910241 +0000 UTC m=+799.432030776" lastFinishedPulling="2025-09-30 05:42:10.490539142 +0000 UTC m=+800.817659677" observedRunningTime="2025-09-30 05:42:11.13397771 +0000 UTC m=+801.461098225" watchObservedRunningTime="2025-09-30 05:42:11.135826189 +0000 UTC m=+801.462946714" Sep 30 05:42:12 crc kubenswrapper[4956]: I0930 05:42:12.990146 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nsq7j" Sep 30 05:42:13 crc kubenswrapper[4956]: I0930 05:42:13.490271 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nsq7j"] Sep 30 05:42:13 crc kubenswrapper[4956]: I0930 05:42:13.606217 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nmgpv"] Sep 30 05:42:13 crc kubenswrapper[4956]: I0930 05:42:13.606633 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nmgpv" podUID="88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" containerName="registry-server" containerID="cri-o://4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de" gracePeriod=2 Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.063487 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nmgpv" Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.137626 4956 generic.go:334] "Generic (PLEG): container finished" podID="88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" containerID="4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de" exitCode=0 Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.137666 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgpv" event={"ID":"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5","Type":"ContainerDied","Data":"4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de"} Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.137697 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgpv" event={"ID":"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5","Type":"ContainerDied","Data":"62a79d2b5f87f317bd987f9f8954e33ea1a5dc00d2cd86ee2ef11ce84ad637a6"} Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.137714 4956 scope.go:117] "RemoveContainer" containerID="4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de" Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.137712 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nmgpv" Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.158316 4956 scope.go:117] "RemoveContainer" containerID="54219330ee092221fdc6c1bb976e4857084b942c240f99f725bb8a577803bd91" Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.189106 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bmjb\" (UniqueName: \"kubernetes.io/projected/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-kube-api-access-2bmjb\") pod \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\" (UID: \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\") " Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.189192 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-utilities\") pod \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\" (UID: \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\") " Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.190003 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-utilities" (OuterVolumeSpecName: "utilities") pod "88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" (UID: "88e6e028-4a4e-4be5-b4c1-f28e1d244cc5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.190095 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-catalog-content\") pod \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\" (UID: \"88e6e028-4a4e-4be5-b4c1-f28e1d244cc5\") " Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.190531 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.199509 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-kube-api-access-2bmjb" (OuterVolumeSpecName: "kube-api-access-2bmjb") pod "88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" (UID: "88e6e028-4a4e-4be5-b4c1-f28e1d244cc5"). InnerVolumeSpecName "kube-api-access-2bmjb". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.211988 4956 scope.go:117] "RemoveContainer" containerID="2882f216b49aa1d6b463a189939765ca3efaaea0b6d6ea643e1436e065fc469a"
Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.246302 4956 scope.go:117] "RemoveContainer" containerID="4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de"
Sep 30 05:42:14 crc kubenswrapper[4956]: E0930 05:42:14.250211 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de\": container with ID starting with 4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de not found: ID does not exist" containerID="4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de"
Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.250253 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de"} err="failed to get container status \"4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de\": rpc error: code = NotFound desc = could not find container \"4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de\": container with ID starting with 4bca2f0140f861b5b5a84a92c1bab0f2072ab9f1836446742f1371702419b6de not found: ID does not exist"
Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.250299 4956 scope.go:117] "RemoveContainer" containerID="54219330ee092221fdc6c1bb976e4857084b942c240f99f725bb8a577803bd91"
Sep 30 05:42:14 crc kubenswrapper[4956]: E0930 05:42:14.252628 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54219330ee092221fdc6c1bb976e4857084b942c240f99f725bb8a577803bd91\": container with ID starting with 54219330ee092221fdc6c1bb976e4857084b942c240f99f725bb8a577803bd91 not found: ID does not exist" containerID="54219330ee092221fdc6c1bb976e4857084b942c240f99f725bb8a577803bd91"
Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.252666 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54219330ee092221fdc6c1bb976e4857084b942c240f99f725bb8a577803bd91"} err="failed to get container status \"54219330ee092221fdc6c1bb976e4857084b942c240f99f725bb8a577803bd91\": rpc error: code = NotFound desc = could not find container \"54219330ee092221fdc6c1bb976e4857084b942c240f99f725bb8a577803bd91\": container with ID starting with 54219330ee092221fdc6c1bb976e4857084b942c240f99f725bb8a577803bd91 not found: ID does not exist"
Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.252699 4956 scope.go:117] "RemoveContainer" containerID="2882f216b49aa1d6b463a189939765ca3efaaea0b6d6ea643e1436e065fc469a"
Sep 30 05:42:14 crc kubenswrapper[4956]: E0930 05:42:14.255033 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2882f216b49aa1d6b463a189939765ca3efaaea0b6d6ea643e1436e065fc469a\": container with ID starting with 2882f216b49aa1d6b463a189939765ca3efaaea0b6d6ea643e1436e065fc469a not found: ID does not exist" containerID="2882f216b49aa1d6b463a189939765ca3efaaea0b6d6ea643e1436e065fc469a"
Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.255085 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2882f216b49aa1d6b463a189939765ca3efaaea0b6d6ea643e1436e065fc469a"} err="failed to get container status \"2882f216b49aa1d6b463a189939765ca3efaaea0b6d6ea643e1436e065fc469a\": rpc error: code = NotFound desc = could not find container \"2882f216b49aa1d6b463a189939765ca3efaaea0b6d6ea643e1436e065fc469a\": container with ID starting with 2882f216b49aa1d6b463a189939765ca3efaaea0b6d6ea643e1436e065fc469a not found: ID does not exist"
Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.272798 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" (UID: "88e6e028-4a4e-4be5-b4c1-f28e1d244cc5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.291472 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.291496 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bmjb\" (UniqueName: \"kubernetes.io/projected/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5-kube-api-access-2bmjb\") on node \"crc\" DevicePath \"\""
Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.459199 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nmgpv"]
Sep 30 05:42:14 crc kubenswrapper[4956]: I0930 05:42:14.462471 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nmgpv"]
Sep 30 05:42:16 crc kubenswrapper[4956]: I0930 05:42:16.347799 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" path="/var/lib/kubelet/pods/88e6e028-4a4e-4be5-b4c1-f28e1d244cc5/volumes"
Sep 30 05:42:17 crc kubenswrapper[4956]: I0930 05:42:17.764616 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k4wfk"
Sep 30 05:42:17 crc kubenswrapper[4956]: I0930 05:42:17.764900 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k4wfk"
Sep 30 05:42:17 crc kubenswrapper[4956]: I0930 05:42:17.807911 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k4wfk"
Sep 30 05:42:18 crc kubenswrapper[4956]: I0930 05:42:18.238965 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k4wfk"
Sep 30 05:42:18 crc kubenswrapper[4956]: I0930 05:42:18.811144 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4wfk"]
Sep 30 05:42:20 crc kubenswrapper[4956]: I0930 05:42:20.197370 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k4wfk" podUID="4ff360ad-893d-4f2a-9d65-3b79c965edbd" containerName="registry-server" containerID="cri-o://6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64" gracePeriod=2
Sep 30 05:42:20 crc kubenswrapper[4956]: I0930 05:42:20.627250 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4wfk"
Sep 30 05:42:20 crc kubenswrapper[4956]: I0930 05:42:20.770008 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff360ad-893d-4f2a-9d65-3b79c965edbd-catalog-content\") pod \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\" (UID: \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\") "
Sep 30 05:42:20 crc kubenswrapper[4956]: I0930 05:42:20.770128 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vskf9\" (UniqueName: \"kubernetes.io/projected/4ff360ad-893d-4f2a-9d65-3b79c965edbd-kube-api-access-vskf9\") pod \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\" (UID: \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\") "
Sep 30 05:42:20 crc kubenswrapper[4956]: I0930 05:42:20.770194 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff360ad-893d-4f2a-9d65-3b79c965edbd-utilities\") pod \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\" (UID: \"4ff360ad-893d-4f2a-9d65-3b79c965edbd\") "
Sep 30 05:42:20 crc kubenswrapper[4956]: I0930 05:42:20.771130 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ff360ad-893d-4f2a-9d65-3b79c965edbd-utilities" (OuterVolumeSpecName: "utilities") pod "4ff360ad-893d-4f2a-9d65-3b79c965edbd" (UID: "4ff360ad-893d-4f2a-9d65-3b79c965edbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:42:20 crc kubenswrapper[4956]: I0930 05:42:20.791414 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff360ad-893d-4f2a-9d65-3b79c965edbd-kube-api-access-vskf9" (OuterVolumeSpecName: "kube-api-access-vskf9") pod "4ff360ad-893d-4f2a-9d65-3b79c965edbd" (UID: "4ff360ad-893d-4f2a-9d65-3b79c965edbd"). InnerVolumeSpecName "kube-api-access-vskf9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:42:20 crc kubenswrapper[4956]: I0930 05:42:20.827731 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ff360ad-893d-4f2a-9d65-3b79c965edbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ff360ad-893d-4f2a-9d65-3b79c965edbd" (UID: "4ff360ad-893d-4f2a-9d65-3b79c965edbd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:42:20 crc kubenswrapper[4956]: I0930 05:42:20.872004 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff360ad-893d-4f2a-9d65-3b79c965edbd-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 05:42:20 crc kubenswrapper[4956]: I0930 05:42:20.872036 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vskf9\" (UniqueName: \"kubernetes.io/projected/4ff360ad-893d-4f2a-9d65-3b79c965edbd-kube-api-access-vskf9\") on node \"crc\" DevicePath \"\""
Sep 30 05:42:20 crc kubenswrapper[4956]: I0930 05:42:20.872049 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff360ad-893d-4f2a-9d65-3b79c965edbd-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.204930 4956 generic.go:334] "Generic (PLEG): container finished" podID="4ff360ad-893d-4f2a-9d65-3b79c965edbd" containerID="6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64" exitCode=0
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.205007 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4wfk"
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.205024 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4wfk" event={"ID":"4ff360ad-893d-4f2a-9d65-3b79c965edbd","Type":"ContainerDied","Data":"6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64"}
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.205398 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4wfk" event={"ID":"4ff360ad-893d-4f2a-9d65-3b79c965edbd","Type":"ContainerDied","Data":"deb44183df6cfd20d49dd5745630938c28d595bb60303ee8872ea75e41213dab"}
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.205418 4956 scope.go:117] "RemoveContainer" containerID="6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64"
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.227418 4956 scope.go:117] "RemoveContainer" containerID="dd1cbd1b5ab5279443025656ac8bcf9387f2e08057f7ca5da044d00a92cd13a8"
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.252187 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4wfk"]
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.255093 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k4wfk"]
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.269257 4956 scope.go:117] "RemoveContainer" containerID="8941b95c816f2ac1bee60f18e13c69e1c56824e54db0628d39893fb7cc632562"
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.291661 4956 scope.go:117] "RemoveContainer" containerID="6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64"
Sep 30 05:42:21 crc kubenswrapper[4956]: E0930 05:42:21.292304 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64\": container with ID starting with 6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64 not found: ID does not exist" containerID="6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64"
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.292345 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64"} err="failed to get container status \"6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64\": rpc error: code = NotFound desc = could not find container \"6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64\": container with ID starting with 6b05611ca111146323df8ff822960457dfaa62308cdfa8fc54543f9127db2e64 not found: ID does not exist"
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.292376 4956 scope.go:117] "RemoveContainer" containerID="dd1cbd1b5ab5279443025656ac8bcf9387f2e08057f7ca5da044d00a92cd13a8"
Sep 30 05:42:21 crc kubenswrapper[4956]: E0930 05:42:21.292660 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd1cbd1b5ab5279443025656ac8bcf9387f2e08057f7ca5da044d00a92cd13a8\": container with ID starting with dd1cbd1b5ab5279443025656ac8bcf9387f2e08057f7ca5da044d00a92cd13a8 not found: ID does not exist" containerID="dd1cbd1b5ab5279443025656ac8bcf9387f2e08057f7ca5da044d00a92cd13a8"
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.292700 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1cbd1b5ab5279443025656ac8bcf9387f2e08057f7ca5da044d00a92cd13a8"} err="failed to get container status \"dd1cbd1b5ab5279443025656ac8bcf9387f2e08057f7ca5da044d00a92cd13a8\": rpc error: code = NotFound desc = could not find container \"dd1cbd1b5ab5279443025656ac8bcf9387f2e08057f7ca5da044d00a92cd13a8\": container with ID starting with dd1cbd1b5ab5279443025656ac8bcf9387f2e08057f7ca5da044d00a92cd13a8 not found: ID does not exist"
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.292725 4956 scope.go:117] "RemoveContainer" containerID="8941b95c816f2ac1bee60f18e13c69e1c56824e54db0628d39893fb7cc632562"
Sep 30 05:42:21 crc kubenswrapper[4956]: E0930 05:42:21.293145 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8941b95c816f2ac1bee60f18e13c69e1c56824e54db0628d39893fb7cc632562\": container with ID starting with 8941b95c816f2ac1bee60f18e13c69e1c56824e54db0628d39893fb7cc632562 not found: ID does not exist" containerID="8941b95c816f2ac1bee60f18e13c69e1c56824e54db0628d39893fb7cc632562"
Sep 30 05:42:21 crc kubenswrapper[4956]: I0930 05:42:21.293175 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8941b95c816f2ac1bee60f18e13c69e1c56824e54db0628d39893fb7cc632562"} err="failed to get container status \"8941b95c816f2ac1bee60f18e13c69e1c56824e54db0628d39893fb7cc632562\": rpc error: code = NotFound desc = could not find container \"8941b95c816f2ac1bee60f18e13c69e1c56824e54db0628d39893fb7cc632562\": container with ID starting with 8941b95c816f2ac1bee60f18e13c69e1c56824e54db0628d39893fb7cc632562 not found: ID does not exist"
Sep 30 05:42:22 crc kubenswrapper[4956]: I0930 05:42:22.348516 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ff360ad-893d-4f2a-9d65-3b79c965edbd" path="/var/lib/kubelet/pods/4ff360ad-893d-4f2a-9d65-3b79c965edbd/volumes"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.619645 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k"]
Sep 30 05:42:24 crc kubenswrapper[4956]: E0930 05:42:24.620266 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" containerName="extract-utilities"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.620280 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" containerName="extract-utilities"
Sep 30 05:42:24 crc kubenswrapper[4956]: E0930 05:42:24.620294 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" containerName="registry-server"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.620301 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" containerName="registry-server"
Sep 30 05:42:24 crc kubenswrapper[4956]: E0930 05:42:24.620322 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff360ad-893d-4f2a-9d65-3b79c965edbd" containerName="extract-content"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.620329 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff360ad-893d-4f2a-9d65-3b79c965edbd" containerName="extract-content"
Sep 30 05:42:24 crc kubenswrapper[4956]: E0930 05:42:24.620347 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff360ad-893d-4f2a-9d65-3b79c965edbd" containerName="registry-server"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.620354 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff360ad-893d-4f2a-9d65-3b79c965edbd" containerName="registry-server"
Sep 30 05:42:24 crc kubenswrapper[4956]: E0930 05:42:24.620366 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff360ad-893d-4f2a-9d65-3b79c965edbd" containerName="extract-utilities"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.620372 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff360ad-893d-4f2a-9d65-3b79c965edbd" containerName="extract-utilities"
Sep 30 05:42:24 crc kubenswrapper[4956]: E0930 05:42:24.620384 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" containerName="extract-content"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.620390 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" containerName="extract-content"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.620523 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ff360ad-893d-4f2a-9d65-3b79c965edbd" containerName="registry-server"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.620717 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e6e028-4a4e-4be5-b4c1-f28e1d244cc5" containerName="registry-server"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.621470 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.623748 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-mh2zl"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.640083 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.649295 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.650693 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.653575 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-w2tj4"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.655375 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.656410 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.658142 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-t64qz"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.667594 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.673682 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.678094 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.679030 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.681958 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-rh4c6"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.699216 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.733135 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74plp\" (UniqueName: \"kubernetes.io/projected/e8df7824-e9a8-4794-bb91-411ae6639639-kube-api-access-74plp\") pod \"barbican-operator-controller-manager-f7f98cb69-mrz5k\" (UID: \"e8df7824-e9a8-4794-bb91-411ae6639639\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.746089 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.754327 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.778599 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-4ft4n"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.778740 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.804517 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.806010 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.808919 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.812642 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qgp8r"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.813394 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.817892 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.818078 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-b7tn7"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.835249 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg2tf\" (UniqueName: \"kubernetes.io/projected/8ce74c21-dde5-40bb-8c42-96e4165b8541-kube-api-access-qg2tf\") pod \"glance-operator-controller-manager-8bc4775b5-kvfmc\" (UID: \"8ce74c21-dde5-40bb-8c42-96e4165b8541\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.835296 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbx7p\" (UniqueName: \"kubernetes.io/projected/ffab19aa-8b8f-4067-b19c-3ccd9352cb12-kube-api-access-zbx7p\") pod \"designate-operator-controller-manager-77fb7bcf5b-nghm2\" (UID: \"ffab19aa-8b8f-4067-b19c-3ccd9352cb12\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.835337 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74plp\" (UniqueName: \"kubernetes.io/projected/e8df7824-e9a8-4794-bb91-411ae6639639-kube-api-access-74plp\") pod \"barbican-operator-controller-manager-f7f98cb69-mrz5k\" (UID: \"e8df7824-e9a8-4794-bb91-411ae6639639\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.835361 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6fdb\" (UniqueName: \"kubernetes.io/projected/3c1366b7-aa64-4089-a853-e2027658e237-kube-api-access-f6fdb\") pod \"cinder-operator-controller-manager-859cd486d-hhv4p\" (UID: \"3c1366b7-aa64-4089-a853-e2027658e237\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.845324 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.867375 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.894816 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74plp\" (UniqueName: \"kubernetes.io/projected/e8df7824-e9a8-4794-bb91-411ae6639639-kube-api-access-74plp\") pod \"barbican-operator-controller-manager-f7f98cb69-mrz5k\" (UID: \"e8df7824-e9a8-4794-bb91-411ae6639639\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.896360 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.897442 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.908102 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-f7tpw"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.908264 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.909257 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.916018 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-rp5n2"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.922504 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.934179 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.939739 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.940625 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2cfx\" (UniqueName: \"kubernetes.io/projected/c1571b7d-f7d4-470d-90ac-d276a39ea2b1-kube-api-access-d2cfx\") pod \"heat-operator-controller-manager-5b4fc86755-mklbg\" (UID: \"c1571b7d-f7d4-470d-90ac-d276a39ea2b1\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.940767 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6fdb\" (UniqueName: \"kubernetes.io/projected/3c1366b7-aa64-4089-a853-e2027658e237-kube-api-access-f6fdb\") pod \"cinder-operator-controller-manager-859cd486d-hhv4p\" (UID: \"3c1366b7-aa64-4089-a853-e2027658e237\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.940849 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89ba53cc-155b-485b-926c-83eaa0772764-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-g85xg\" (UID: \"89ba53cc-155b-485b-926c-83eaa0772764\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.940934 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-965hs\" (UniqueName: \"kubernetes.io/projected/89ba53cc-155b-485b-926c-83eaa0772764-kube-api-access-965hs\") pod \"infra-operator-controller-manager-7d9c7d9477-g85xg\" (UID: \"89ba53cc-155b-485b-926c-83eaa0772764\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.941000 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g6vg\" (UniqueName: \"kubernetes.io/projected/8c84e3e7-f42f-46df-af16-516bc2cac4a0-kube-api-access-4g6vg\") pod \"horizon-operator-controller-manager-679b4759bb-ncr6h\" (UID: \"8c84e3e7-f42f-46df-af16-516bc2cac4a0\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.941088 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg2tf\" (UniqueName: \"kubernetes.io/projected/8ce74c21-dde5-40bb-8c42-96e4165b8541-kube-api-access-qg2tf\") pod \"glance-operator-controller-manager-8bc4775b5-kvfmc\" (UID: \"8ce74c21-dde5-40bb-8c42-96e4165b8541\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.941195 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbx7p\" (UniqueName: \"kubernetes.io/projected/ffab19aa-8b8f-4067-b19c-3ccd9352cb12-kube-api-access-zbx7p\") pod \"designate-operator-controller-manager-77fb7bcf5b-nghm2\" (UID: \"ffab19aa-8b8f-4067-b19c-3ccd9352cb12\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.952163 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.953418 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.955555 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.956712 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.960499 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.975794 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-xtxrq"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.976040 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-gt4qz"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.976545 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.997890 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8"]
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.998658 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg2tf\" (UniqueName: \"kubernetes.io/projected/8ce74c21-dde5-40bb-8c42-96e4165b8541-kube-api-access-qg2tf\") pod \"glance-operator-controller-manager-8bc4775b5-kvfmc\" (UID: \"8ce74c21-dde5-40bb-8c42-96e4165b8541\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc"
Sep 30 05:42:24 crc kubenswrapper[4956]: I0930 05:42:24.999091 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.006034 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mr6zq"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.006642 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf"]
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.007954 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.011463 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-sqfvl"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.012224 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbx7p\" (UniqueName: \"kubernetes.io/projected/ffab19aa-8b8f-4067-b19c-3ccd9352cb12-kube-api-access-zbx7p\") pod \"designate-operator-controller-manager-77fb7bcf5b-nghm2\" (UID: \"ffab19aa-8b8f-4067-b19c-3ccd9352cb12\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.015673 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.023288 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8"]
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.026794 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8"]
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.027759 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.029612 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-6bfrn"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.031643 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf"]
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.034824 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6fdb\" (UniqueName: \"kubernetes.io/projected/3c1366b7-aa64-4089-a853-e2027658e237-kube-api-access-f6fdb\") pod \"cinder-operator-controller-manager-859cd486d-hhv4p\" (UID: \"3c1366b7-aa64-4089-a853-e2027658e237\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.036166 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8"]
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.040811 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82"]
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.041631 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.043481 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89ba53cc-155b-485b-926c-83eaa0772764-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-g85xg\" (UID: \"89ba53cc-155b-485b-926c-83eaa0772764\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.043516 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z758t\" (UniqueName: \"kubernetes.io/projected/6ee14caa-a939-467a-bdbb-4160d336eaee-kube-api-access-z758t\") pod \"ironic-operator-controller-manager-6f589bc7f7-jvf6n\" (UID: \"6ee14caa-a939-467a-bdbb-4160d336eaee\") " pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.043535 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-965hs\" (UniqueName: \"kubernetes.io/projected/89ba53cc-155b-485b-926c-83eaa0772764-kube-api-access-965hs\") pod \"infra-operator-controller-manager-7d9c7d9477-g85xg\" (UID: \"89ba53cc-155b-485b-926c-83eaa0772764\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg"
Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.043553 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g6vg\" (UniqueName: \"kubernetes.io/projected/8c84e3e7-f42f-46df-af16-516bc2cac4a0-kube-api-access-4g6vg\") pod \"horizon-operator-controller-manager-679b4759bb-ncr6h\" (UID: \"8c84e3e7-f42f-46df-af16-516bc2cac4a0\") " 
pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.043608 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2cfx\" (UniqueName: \"kubernetes.io/projected/c1571b7d-f7d4-470d-90ac-d276a39ea2b1-kube-api-access-d2cfx\") pod \"heat-operator-controller-manager-5b4fc86755-mklbg\" (UID: \"c1571b7d-f7d4-470d-90ac-d276a39ea2b1\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.043645 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p2br\" (UniqueName: \"kubernetes.io/projected/101087b5-cd1e-40f3-916f-5e8f5354ac2d-kube-api-access-7p2br\") pod \"keystone-operator-controller-manager-59d7dc95cf-6hssf\" (UID: \"101087b5-cd1e-40f3-916f-5e8f5354ac2d\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf" Sep 30 05:42:25 crc kubenswrapper[4956]: E0930 05:42:25.043744 4956 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 30 05:42:25 crc kubenswrapper[4956]: E0930 05:42:25.043781 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89ba53cc-155b-485b-926c-83eaa0772764-cert podName:89ba53cc-155b-485b-926c-83eaa0772764 nodeName:}" failed. No retries permitted until 2025-09-30 05:42:25.543766184 +0000 UTC m=+815.870886709 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89ba53cc-155b-485b-926c-83eaa0772764-cert") pod "infra-operator-controller-manager-7d9c7d9477-g85xg" (UID: "89ba53cc-155b-485b-926c-83eaa0772764") : secret "infra-operator-webhook-server-cert" not found Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.048963 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-djsmw" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.049194 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.054551 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-v888v"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.055626 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-v888v" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.058344 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-fgzjm" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.065835 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-v888v"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.068465 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-965hs\" (UniqueName: \"kubernetes.io/projected/89ba53cc-155b-485b-926c-83eaa0772764-kube-api-access-965hs\") pod \"infra-operator-controller-manager-7d9c7d9477-g85xg\" (UID: \"89ba53cc-155b-485b-926c-83eaa0772764\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.068881 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.076496 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.077600 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.078413 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g6vg\" (UniqueName: \"kubernetes.io/projected/8c84e3e7-f42f-46df-af16-516bc2cac4a0-kube-api-access-4g6vg\") pod \"horizon-operator-controller-manager-679b4759bb-ncr6h\" (UID: \"8c84e3e7-f42f-46df-af16-516bc2cac4a0\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.080415 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.082682 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-lmmmr" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.084133 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2cfx\" (UniqueName: \"kubernetes.io/projected/c1571b7d-f7d4-470d-90ac-d276a39ea2b1-kube-api-access-d2cfx\") pod \"heat-operator-controller-manager-5b4fc86755-mklbg\" (UID: \"c1571b7d-f7d4-470d-90ac-d276a39ea2b1\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.085079 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.087784 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.089421 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-pxpsm" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.094588 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.097965 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.121087 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.122157 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.122225 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.135275 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.148090 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6d6j\" (UniqueName: \"kubernetes.io/projected/124abac1-4adc-4a56-8d2b-241e0eb4bf57-kube-api-access-l6d6j\") pod \"manila-operator-controller-manager-b7cf8cb5f-m8xs8\" (UID: \"124abac1-4adc-4a56-8d2b-241e0eb4bf57\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.148150 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6pgg\" (UniqueName: \"kubernetes.io/projected/c07dffb1-ebd0-44e9-8061-ce680870aba3-kube-api-access-p6pgg\") pod \"octavia-operator-controller-manager-6fb7d6b8bf-2rcv8\" (UID: \"c07dffb1-ebd0-44e9-8061-ce680870aba3\") " pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.148191 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6npn\" (UniqueName: \"kubernetes.io/projected/1ed82f79-3f95-4293-937a-f5d82ce37f10-kube-api-access-r6npn\") pod \"mariadb-operator-controller-manager-67bf5bb885-nhj4d\" (UID: \"1ed82f79-3f95-4293-937a-f5d82ce37f10\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.148235 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d99b\" (UniqueName: \"kubernetes.io/projected/ca55e873-96fb-4348-ba98-58ab9648de78-kube-api-access-9d99b\") pod \"neutron-operator-controller-manager-6b96467f46-5x2c8\" (UID: \"ca55e873-96fb-4348-ba98-58ab9648de78\") " 
pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.148280 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p2br\" (UniqueName: \"kubernetes.io/projected/101087b5-cd1e-40f3-916f-5e8f5354ac2d-kube-api-access-7p2br\") pod \"keystone-operator-controller-manager-59d7dc95cf-6hssf\" (UID: \"101087b5-cd1e-40f3-916f-5e8f5354ac2d\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.148337 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bndhz\" (UniqueName: \"kubernetes.io/projected/0b6d8a4b-faca-4779-be46-219d3c0a3e22-kube-api-access-bndhz\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82\" (UID: \"0b6d8a4b-faca-4779-be46-219d3c0a3e22\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.148383 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z758t\" (UniqueName: \"kubernetes.io/projected/6ee14caa-a939-467a-bdbb-4160d336eaee-kube-api-access-z758t\") pod \"ironic-operator-controller-manager-6f589bc7f7-jvf6n\" (UID: \"6ee14caa-a939-467a-bdbb-4160d336eaee\") " pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.148561 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b6d8a4b-faca-4779-be46-219d3c0a3e22-cert\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82\" (UID: \"0b6d8a4b-faca-4779-be46-219d3c0a3e22\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" Sep 30 05:42:25 crc 
kubenswrapper[4956]: I0930 05:42:25.148725 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn6hl\" (UniqueName: \"kubernetes.io/projected/cbc3fab0-8876-49a7-a85f-4844e253595f-kube-api-access-gn6hl\") pod \"nova-operator-controller-manager-79f9fc9fd8-j79gf\" (UID: \"cbc3fab0-8876-49a7-a85f-4844e253595f\") " pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.170189 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.171912 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.177955 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-dpg9f" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.181369 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-lx8d2" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.201294 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.219781 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z758t\" (UniqueName: \"kubernetes.io/projected/6ee14caa-a939-467a-bdbb-4160d336eaee-kube-api-access-z758t\") pod \"ironic-operator-controller-manager-6f589bc7f7-jvf6n\" (UID: \"6ee14caa-a939-467a-bdbb-4160d336eaee\") " pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.219880 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p2br\" (UniqueName: \"kubernetes.io/projected/101087b5-cd1e-40f3-916f-5e8f5354ac2d-kube-api-access-7p2br\") pod \"keystone-operator-controller-manager-59d7dc95cf-6hssf\" (UID: \"101087b5-cd1e-40f3-916f-5e8f5354ac2d\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.232654 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.233862 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.235679 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-bkqt9" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.242202 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.243991 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.251784 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d99b\" (UniqueName: \"kubernetes.io/projected/ca55e873-96fb-4348-ba98-58ab9648de78-kube-api-access-9d99b\") pod \"neutron-operator-controller-manager-6b96467f46-5x2c8\" (UID: \"ca55e873-96fb-4348-ba98-58ab9648de78\") " pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.251847 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bndhz\" (UniqueName: \"kubernetes.io/projected/0b6d8a4b-faca-4779-be46-219d3c0a3e22-kube-api-access-bndhz\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82\" (UID: \"0b6d8a4b-faca-4779-be46-219d3c0a3e22\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.251885 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msks7\" (UniqueName: \"kubernetes.io/projected/f93913e6-5d74-4030-ac26-a10781a72db0-kube-api-access-msks7\") pod \"swift-operator-controller-manager-657c6b68c7-6hgwr\" (UID: \"f93913e6-5d74-4030-ac26-a10781a72db0\") " pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.251907 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkbml\" (UniqueName: 
\"kubernetes.io/projected/69909a1b-9121-45ae-aaeb-e63950300ec9-kube-api-access-xkbml\") pod \"placement-operator-controller-manager-598c4c8547-xh7wx\" (UID: \"69909a1b-9121-45ae-aaeb-e63950300ec9\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.251927 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b6d8a4b-faca-4779-be46-219d3c0a3e22-cert\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82\" (UID: \"0b6d8a4b-faca-4779-be46-219d3c0a3e22\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.251960 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn6hl\" (UniqueName: \"kubernetes.io/projected/cbc3fab0-8876-49a7-a85f-4844e253595f-kube-api-access-gn6hl\") pod \"nova-operator-controller-manager-79f9fc9fd8-j79gf\" (UID: \"cbc3fab0-8876-49a7-a85f-4844e253595f\") " pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.251999 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2f4k\" (UniqueName: \"kubernetes.io/projected/c0c96af5-0c02-4dfd-91e5-947696cb4899-kube-api-access-m2f4k\") pod \"ovn-operator-controller-manager-84c745747f-v888v\" (UID: \"c0c96af5-0c02-4dfd-91e5-947696cb4899\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-v888v" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.252018 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6d6j\" (UniqueName: \"kubernetes.io/projected/124abac1-4adc-4a56-8d2b-241e0eb4bf57-kube-api-access-l6d6j\") pod \"manila-operator-controller-manager-b7cf8cb5f-m8xs8\" (UID: 
\"124abac1-4adc-4a56-8d2b-241e0eb4bf57\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.252034 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpn42\" (UniqueName: \"kubernetes.io/projected/9cdfca4b-0805-4ebc-92e1-906044d82e4b-kube-api-access-gpn42\") pod \"telemetry-operator-controller-manager-cb66d6b59-wxgr6\" (UID: \"9cdfca4b-0805-4ebc-92e1-906044d82e4b\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.252052 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6pgg\" (UniqueName: \"kubernetes.io/projected/c07dffb1-ebd0-44e9-8061-ce680870aba3-kube-api-access-p6pgg\") pod \"octavia-operator-controller-manager-6fb7d6b8bf-2rcv8\" (UID: \"c07dffb1-ebd0-44e9-8061-ce680870aba3\") " pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.252072 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6npn\" (UniqueName: \"kubernetes.io/projected/1ed82f79-3f95-4293-937a-f5d82ce37f10-kube-api-access-r6npn\") pod \"mariadb-operator-controller-manager-67bf5bb885-nhj4d\" (UID: \"1ed82f79-3f95-4293-937a-f5d82ce37f10\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" Sep 30 05:42:25 crc kubenswrapper[4956]: E0930 05:42:25.252341 4956 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 05:42:25 crc kubenswrapper[4956]: E0930 05:42:25.252417 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b6d8a4b-faca-4779-be46-219d3c0a3e22-cert 
podName:0b6d8a4b-faca-4779-be46-219d3c0a3e22 nodeName:}" failed. No retries permitted until 2025-09-30 05:42:25.752395742 +0000 UTC m=+816.079516357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0b6d8a4b-faca-4779-be46-219d3c0a3e22-cert") pod "openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" (UID: "0b6d8a4b-faca-4779-be46-219d3c0a3e22") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.266719 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.278043 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.287574 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6npn\" (UniqueName: \"kubernetes.io/projected/1ed82f79-3f95-4293-937a-f5d82ce37f10-kube-api-access-r6npn\") pod \"mariadb-operator-controller-manager-67bf5bb885-nhj4d\" (UID: \"1ed82f79-3f95-4293-937a-f5d82ce37f10\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.290956 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.297710 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn6hl\" (UniqueName: \"kubernetes.io/projected/cbc3fab0-8876-49a7-a85f-4844e253595f-kube-api-access-gn6hl\") pod \"nova-operator-controller-manager-79f9fc9fd8-j79gf\" (UID: \"cbc3fab0-8876-49a7-a85f-4844e253595f\") " pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.298542 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6d6j\" (UniqueName: \"kubernetes.io/projected/124abac1-4adc-4a56-8d2b-241e0eb4bf57-kube-api-access-l6d6j\") pod \"manila-operator-controller-manager-b7cf8cb5f-m8xs8\" (UID: \"124abac1-4adc-4a56-8d2b-241e0eb4bf57\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.300904 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6pgg\" (UniqueName: \"kubernetes.io/projected/c07dffb1-ebd0-44e9-8061-ce680870aba3-kube-api-access-p6pgg\") pod \"octavia-operator-controller-manager-6fb7d6b8bf-2rcv8\" (UID: \"c07dffb1-ebd0-44e9-8061-ce680870aba3\") " pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.300983 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bndhz\" (UniqueName: \"kubernetes.io/projected/0b6d8a4b-faca-4779-be46-219d3c0a3e22-kube-api-access-bndhz\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82\" (UID: \"0b6d8a4b-faca-4779-be46-219d3c0a3e22\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.302658 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d99b\" (UniqueName: \"kubernetes.io/projected/ca55e873-96fb-4348-ba98-58ab9648de78-kube-api-access-9d99b\") pod \"neutron-operator-controller-manager-6b96467f46-5x2c8\" (UID: \"ca55e873-96fb-4348-ba98-58ab9648de78\") " pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.353633 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msks7\" (UniqueName: \"kubernetes.io/projected/f93913e6-5d74-4030-ac26-a10781a72db0-kube-api-access-msks7\") pod \"swift-operator-controller-manager-657c6b68c7-6hgwr\" (UID: \"f93913e6-5d74-4030-ac26-a10781a72db0\") " pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.353689 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkbml\" (UniqueName: \"kubernetes.io/projected/69909a1b-9121-45ae-aaeb-e63950300ec9-kube-api-access-xkbml\") pod \"placement-operator-controller-manager-598c4c8547-xh7wx\" (UID: \"69909a1b-9121-45ae-aaeb-e63950300ec9\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.353744 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkxm2\" (UniqueName: \"kubernetes.io/projected/f620cf06-9ba1-4866-9964-dc38e574c889-kube-api-access-jkxm2\") pod \"watcher-operator-controller-manager-75756dd4d9-sbn5x\" (UID: \"f620cf06-9ba1-4866-9964-dc38e574c889\") " pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.353774 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lmxz\" (UniqueName: 
\"kubernetes.io/projected/2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f-kube-api-access-6lmxz\") pod \"test-operator-controller-manager-6bb97fcf96-jz8bg\" (UID: \"2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f\") " pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.353841 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2f4k\" (UniqueName: \"kubernetes.io/projected/c0c96af5-0c02-4dfd-91e5-947696cb4899-kube-api-access-m2f4k\") pod \"ovn-operator-controller-manager-84c745747f-v888v\" (UID: \"c0c96af5-0c02-4dfd-91e5-947696cb4899\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-v888v" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.353860 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpn42\" (UniqueName: \"kubernetes.io/projected/9cdfca4b-0805-4ebc-92e1-906044d82e4b-kube-api-access-gpn42\") pod \"telemetry-operator-controller-manager-cb66d6b59-wxgr6\" (UID: \"9cdfca4b-0805-4ebc-92e1-906044d82e4b\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.354430 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.373997 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.387773 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.389441 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msks7\" (UniqueName: \"kubernetes.io/projected/f93913e6-5d74-4030-ac26-a10781a72db0-kube-api-access-msks7\") pod \"swift-operator-controller-manager-657c6b68c7-6hgwr\" (UID: \"f93913e6-5d74-4030-ac26-a10781a72db0\") " pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.391379 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.392503 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.399468 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkbml\" (UniqueName: \"kubernetes.io/projected/69909a1b-9121-45ae-aaeb-e63950300ec9-kube-api-access-xkbml\") pod \"placement-operator-controller-manager-598c4c8547-xh7wx\" (UID: \"69909a1b-9121-45ae-aaeb-e63950300ec9\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.401093 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.402005 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.410751 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rq75d" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.414818 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2f4k\" (UniqueName: \"kubernetes.io/projected/c0c96af5-0c02-4dfd-91e5-947696cb4899-kube-api-access-m2f4k\") pod \"ovn-operator-controller-manager-84c745747f-v888v\" (UID: \"c0c96af5-0c02-4dfd-91e5-947696cb4899\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-v888v" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.419868 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpn42\" (UniqueName: \"kubernetes.io/projected/9cdfca4b-0805-4ebc-92e1-906044d82e4b-kube-api-access-gpn42\") pod \"telemetry-operator-controller-manager-cb66d6b59-wxgr6\" (UID: \"9cdfca4b-0805-4ebc-92e1-906044d82e4b\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.442583 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.457080 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkxm2\" (UniqueName: \"kubernetes.io/projected/f620cf06-9ba1-4866-9964-dc38e574c889-kube-api-access-jkxm2\") pod \"watcher-operator-controller-manager-75756dd4d9-sbn5x\" (UID: \"f620cf06-9ba1-4866-9964-dc38e574c889\") " pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.457154 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lmxz\" (UniqueName: \"kubernetes.io/projected/2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f-kube-api-access-6lmxz\") pod \"test-operator-controller-manager-6bb97fcf96-jz8bg\" (UID: \"2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f\") " pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.459880 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.474216 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-v888v" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.505186 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkxm2\" (UniqueName: \"kubernetes.io/projected/f620cf06-9ba1-4866-9964-dc38e574c889-kube-api-access-jkxm2\") pod \"watcher-operator-controller-manager-75756dd4d9-sbn5x\" (UID: \"f620cf06-9ba1-4866-9964-dc38e574c889\") " pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.515733 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lmxz\" (UniqueName: \"kubernetes.io/projected/2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f-kube-api-access-6lmxz\") pod \"test-operator-controller-manager-6bb97fcf96-jz8bg\" (UID: \"2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f\") " pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.536396 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j2hpj"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.537272 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j2hpj" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.558180 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89ba53cc-155b-485b-926c-83eaa0772764-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-g85xg\" (UID: \"89ba53cc-155b-485b-926c-83eaa0772764\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.558244 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e369192-5374-4d18-954d-7d46ff60e9c1-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-7nhwl\" (UID: \"2e369192-5374-4d18-954d-7d46ff60e9c1\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.558389 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kthfx\" (UniqueName: \"kubernetes.io/projected/2e369192-5374-4d18-954d-7d46ff60e9c1-kube-api-access-kthfx\") pod \"openstack-operator-controller-manager-7b7bb8bd67-7nhwl\" (UID: \"2e369192-5374-4d18-954d-7d46ff60e9c1\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.566302 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-chtzh" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.572672 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j2hpj"] Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.572929 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.576689 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.581833 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89ba53cc-155b-485b-926c-83eaa0772764-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-g85xg\" (UID: \"89ba53cc-155b-485b-926c-83eaa0772764\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.593460 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.609983 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.613305 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.661355 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kthfx\" (UniqueName: \"kubernetes.io/projected/2e369192-5374-4d18-954d-7d46ff60e9c1-kube-api-access-kthfx\") pod \"openstack-operator-controller-manager-7b7bb8bd67-7nhwl\" (UID: \"2e369192-5374-4d18-954d-7d46ff60e9c1\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.661492 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvcwn\" (UniqueName: \"kubernetes.io/projected/49d67534-20e0-48be-9614-eec49889c4a7-kube-api-access-cvcwn\") pod \"rabbitmq-cluster-operator-manager-79d8469568-j2hpj\" (UID: \"49d67534-20e0-48be-9614-eec49889c4a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j2hpj" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.661570 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e369192-5374-4d18-954d-7d46ff60e9c1-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-7nhwl\" (UID: \"2e369192-5374-4d18-954d-7d46ff60e9c1\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.661649 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k"] Sep 30 05:42:25 crc kubenswrapper[4956]: E0930 05:42:25.662098 4956 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 05:42:25 crc kubenswrapper[4956]: E0930 05:42:25.662862 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2e369192-5374-4d18-954d-7d46ff60e9c1-cert podName:2e369192-5374-4d18-954d-7d46ff60e9c1 nodeName:}" failed. No retries permitted until 2025-09-30 05:42:26.162733309 +0000 UTC m=+816.489853834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e369192-5374-4d18-954d-7d46ff60e9c1-cert") pod "openstack-operator-controller-manager-7b7bb8bd67-7nhwl" (UID: "2e369192-5374-4d18-954d-7d46ff60e9c1") : secret "webhook-server-cert" not found Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.720030 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kthfx\" (UniqueName: \"kubernetes.io/projected/2e369192-5374-4d18-954d-7d46ff60e9c1-kube-api-access-kthfx\") pod \"openstack-operator-controller-manager-7b7bb8bd67-7nhwl\" (UID: \"2e369192-5374-4d18-954d-7d46ff60e9c1\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.752050 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.763999 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvcwn\" (UniqueName: \"kubernetes.io/projected/49d67534-20e0-48be-9614-eec49889c4a7-kube-api-access-cvcwn\") pod \"rabbitmq-cluster-operator-manager-79d8469568-j2hpj\" (UID: \"49d67534-20e0-48be-9614-eec49889c4a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j2hpj" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.764451 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b6d8a4b-faca-4779-be46-219d3c0a3e22-cert\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82\" (UID: \"0b6d8a4b-faca-4779-be46-219d3c0a3e22\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.768662 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b6d8a4b-faca-4779-be46-219d3c0a3e22-cert\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82\" (UID: \"0b6d8a4b-faca-4779-be46-219d3c0a3e22\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.794799 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvcwn\" (UniqueName: \"kubernetes.io/projected/49d67534-20e0-48be-9614-eec49889c4a7-kube-api-access-cvcwn\") pod \"rabbitmq-cluster-operator-manager-79d8469568-j2hpj\" (UID: \"49d67534-20e0-48be-9614-eec49889c4a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j2hpj" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.902417 4956 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j2hpj" Sep 30 05:42:25 crc kubenswrapper[4956]: I0930 05:42:25.906062 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc"] Sep 30 05:42:26 crc kubenswrapper[4956]: I0930 05:42:26.063370 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" Sep 30 05:42:26 crc kubenswrapper[4956]: I0930 05:42:26.170386 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e369192-5374-4d18-954d-7d46ff60e9c1-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-7nhwl\" (UID: \"2e369192-5374-4d18-954d-7d46ff60e9c1\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" Sep 30 05:42:26 crc kubenswrapper[4956]: E0930 05:42:26.170735 4956 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 05:42:26 crc kubenswrapper[4956]: E0930 05:42:26.170780 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e369192-5374-4d18-954d-7d46ff60e9c1-cert podName:2e369192-5374-4d18-954d-7d46ff60e9c1 nodeName:}" failed. No retries permitted until 2025-09-30 05:42:27.170766676 +0000 UTC m=+817.497887201 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e369192-5374-4d18-954d-7d46ff60e9c1-cert") pod "openstack-operator-controller-manager-7b7bb8bd67-7nhwl" (UID: "2e369192-5374-4d18-954d-7d46ff60e9c1") : secret "webhook-server-cert" not found Sep 30 05:42:26 crc kubenswrapper[4956]: I0930 05:42:26.244598 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc" event={"ID":"8ce74c21-dde5-40bb-8c42-96e4165b8541","Type":"ContainerStarted","Data":"8045c80843a8d3dd9e36f09c665477df8f93ba90c0c7b877313a864b41e08219"} Sep 30 05:42:26 crc kubenswrapper[4956]: I0930 05:42:26.246340 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k" event={"ID":"e8df7824-e9a8-4794-bb91-411ae6639639","Type":"ContainerStarted","Data":"d8c2ad977b731709ac9b6578778c38d3ae8e1749b91066ca62f675afcf43c790"} Sep 30 05:42:26 crc kubenswrapper[4956]: I0930 05:42:26.550866 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf"] Sep 30 05:42:26 crc kubenswrapper[4956]: I0930 05:42:26.560510 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h"] Sep 30 05:42:26 crc kubenswrapper[4956]: I0930 05:42:26.566071 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg"] Sep 30 05:42:26 crc kubenswrapper[4956]: W0930 05:42:26.567945 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod101087b5_cd1e_40f3_916f_5e8f5354ac2d.slice/crio-6eb71a27d5eb4864ede587a880271d2313f02ac2549babbbd0e8da90c8303040 WatchSource:0}: Error finding container 6eb71a27d5eb4864ede587a880271d2313f02ac2549babbbd0e8da90c8303040: Status 404 returned 
error can't find the container with id 6eb71a27d5eb4864ede587a880271d2313f02ac2549babbbd0e8da90c8303040 Sep 30 05:42:26 crc kubenswrapper[4956]: I0930 05:42:26.579969 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p"] Sep 30 05:42:26 crc kubenswrapper[4956]: W0930 05:42:26.637274 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c1366b7_aa64_4089_a853_e2027658e237.slice/crio-eb6371e02e8028bda3e84eebc43ca80d8db336b7dcfbab7b12864d5414675a41 WatchSource:0}: Error finding container eb6371e02e8028bda3e84eebc43ca80d8db336b7dcfbab7b12864d5414675a41: Status 404 returned error can't find the container with id eb6371e02e8028bda3e84eebc43ca80d8db336b7dcfbab7b12864d5414675a41 Sep 30 05:42:26 crc kubenswrapper[4956]: I0930 05:42:26.930772 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n"] Sep 30 05:42:26 crc kubenswrapper[4956]: I0930 05:42:26.980319 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6"] Sep 30 05:42:26 crc kubenswrapper[4956]: I0930 05:42:26.987526 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2"] Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:26.997195 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8"] Sep 30 05:42:27 crc kubenswrapper[4956]: W0930 05:42:27.003778 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0c96af5_0c02_4dfd_91e5_947696cb4899.slice/crio-55ff16bb5c22d04387b4a33300f218ebf29f82c5a282915eaca5ba6653c9538f WatchSource:0}: Error finding container 
55ff16bb5c22d04387b4a33300f218ebf29f82c5a282915eaca5ba6653c9538f: Status 404 returned error can't find the container with id 55ff16bb5c22d04387b4a33300f218ebf29f82c5a282915eaca5ba6653c9538f Sep 30 05:42:27 crc kubenswrapper[4956]: W0930 05:42:27.003913 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69909a1b_9121_45ae_aaeb_e63950300ec9.slice/crio-c577a09a16d1ca6339cf0661611055bed1c4dc5c93f646ca55ba25955f8048ca WatchSource:0}: Error finding container c577a09a16d1ca6339cf0661611055bed1c4dc5c93f646ca55ba25955f8048ca: Status 404 returned error can't find the container with id c577a09a16d1ca6339cf0661611055bed1c4dc5c93f646ca55ba25955f8048ca Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.016473 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8"] Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.023380 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx"] Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.032082 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg"] Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.045280 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg"] Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.056499 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gpn42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
telemetry-operator-controller-manager-cb66d6b59-wxgr6_openstack-operators(9cdfca4b-0805-4ebc-92e1-906044d82e4b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.058460 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-v888v"] Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.064452 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p6pgg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6fb7d6b8bf-2rcv8_openstack-operators(c07dffb1-ebd0-44e9-8061-ce680870aba3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.067196 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8"] Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.185333 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" podUID="9cdfca4b-0805-4ebc-92e1-906044d82e4b" Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.188299 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e369192-5374-4d18-954d-7d46ff60e9c1-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-7nhwl\" (UID: \"2e369192-5374-4d18-954d-7d46ff60e9c1\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.189694 
4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" podUID="c07dffb1-ebd0-44e9-8061-ce680870aba3" Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.194437 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e369192-5374-4d18-954d-7d46ff60e9c1-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-7nhwl\" (UID: \"2e369192-5374-4d18-954d-7d46ff60e9c1\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.256937 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d"] Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.262058 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x"] Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.280286 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr"] Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.280334 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j2hpj"] Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.281236 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.281508 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg" event={"ID":"2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f","Type":"ContainerStarted","Data":"781bfa4ade255a3654c3745d72e7fae6068cf643514fa50945123e33a8993d24"} Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.283224 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gn6hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-79f9fc9fd8-j79gf_openstack-operators(cbc3fab0-8876-49a7-a85f-4844e253595f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.283927 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jkxm2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-75756dd4d9-sbn5x_openstack-operators(f620cf06-9ba1-4866-9964-dc38e574c889): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.285282 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r6npn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
mariadb-operator-controller-manager-67bf5bb885-nhj4d_openstack-operators(1ed82f79-3f95-4293-937a-f5d82ce37f10): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.286402 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf"] Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.286484 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-msks7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-657c6b68c7-6hgwr_openstack-operators(f93913e6-5d74-4030-ac26-a10781a72db0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.291378 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82"] Sep 30 05:42:27 crc kubenswrapper[4956]: W0930 05:42:27.306340 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b6d8a4b_faca_4779_be46_219d3c0a3e22.slice/crio-1ef0e7216a42d86a00dff484fdeb3ba84479f67a4b53bbf67be7ffd3a3e590da WatchSource:0}: Error finding container 1ef0e7216a42d86a00dff484fdeb3ba84479f67a4b53bbf67be7ffd3a3e590da: Status 404 returned error can't find the container with id 1ef0e7216a42d86a00dff484fdeb3ba84479f67a4b53bbf67be7ffd3a3e590da Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.307125 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx" 
event={"ID":"69909a1b-9121-45ae-aaeb-e63950300ec9","Type":"ContainerStarted","Data":"c577a09a16d1ca6339cf0661611055bed1c4dc5c93f646ca55ba25955f8048ca"} Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.310319 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2" event={"ID":"ffab19aa-8b8f-4067-b19c-3ccd9352cb12","Type":"ContainerStarted","Data":"3bc73caf4033720509d49784f5d4d63866935f6ec1e06ca6bcfa677dd6e4f841"} Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.311757 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h" event={"ID":"8c84e3e7-f42f-46df-af16-516bc2cac4a0","Type":"ContainerStarted","Data":"6a3d020514b2456e0d7d7e9b085f5bb2c8ec0872c78602964b8adebcef7d067e"} Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.313577 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8" event={"ID":"124abac1-4adc-4a56-8d2b-241e0eb4bf57","Type":"ContainerStarted","Data":"0c3ef4164ebbe0074ab70a45a50f227cb92892fa23ee0b2b05627b4d57b5a486"} Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.314009 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:89f9e06c633ae852be8d3e3ca581def0a6e9a5b38c0d519f656976c7414b6b97,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:56f155abc1b8734e4a79c7306ba38caf8d2881625f37d2f9c5a5763fa4db7e02,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:29c8cd4f2d853f512e2ecd44f522f28c3aac046a72733365aa5e91667041d62e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:ed896681f0d9720f56bbcb0b7a4f3626ed397e89af919604ca68b42b7b598859,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:712e1c932a90ef5e3c3ee5d5aea591a377da8c4af604ebd8ec399869a61dfbef,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:10fd8489a5bf6f1d781e9226de68356132db78b62269e69d632748cb08fae725,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:73fd28af83ea96cc920d26dba6105ee59f0824234527949884e6ca55b71d7533,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-barbican-keystone-listener@sha256:8b3a90516ba0695cf3198a7b101da770c30c8100cb79f8088b5729e6a50ddd6d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:6d42bcf65422d2de9cd807feb3e8b005de10084b4b8eb340c8a9045644ae7aaa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:32a25ac44706b73bff04a89514177b1efd675f0442b295e225f0020555ca6350,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:b19043eac7c653e00da8da9418ae378fdd29698adb1adb4bf5ae7cfc03ba5538,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:c486e00b36ea7698d6a4cd9048a759bad5a8286e4949bbd1f82c3ddb70600b9b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:ef2727f0300fbf3bf15d8ddc409d0fd63e4aac9dd64c86459bd6ff64fc6b9534,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:329aac65ba00c3cf43bb1d5fac88
18752f01de90b47719e2a84db4e2fe083292,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:6ce73885ac1ee7c69468efc448eff5deae46502812c5e3d099f771e1cc03345f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:282cc0fcdbb8a688dd62a2499480aae4a36b620f2160d51e6c8269e6cc32d5fc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:d98c0c9d3bdd84daf4b98d45b8bbe2e67a633491897dda7167664a5fa1f0f26e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:4ad1d36fe1c8992e43910fc2d566b991fd73f9b82b1ab860c66858448ff82c00,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:92789eab1b8a91807a5e898cb63478d125ae539eafe63c96049100c6ddeadb04,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:ee9832268e0df5d62c50c5ce171e9ef72a035aa74c718cfbf482e34426d8d15e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:07b4f96f24f32224c13613f85173f9fcc3092b8797ffa47519403d124bfe4c15,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:3a873c95bcb7ae8bd24ff1eb5fe89ac5272a41a3345a7b41d55419b5d66b70e7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:388dbae2f1aae2720e919cc24d10cd577b73b4e4ef7abdc34287bcb8d27ff98f,ValueFrom:nil,},EnvVar{N
ame:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:d4c1b2496868da3dcca9f4bda0834fcc58d23c21d8ce3c42a68205d02039c487,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:c4414cc2680fb1bacbf99261f759f4ef7401fb2e4953140270bffdab8e002f22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:b9b950a656f1456b3143872c492b0987bf4a9e23bc7c59d843cf50099667b368,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:afd5d6822b86ea0930b2011fede834bb24495995d7baac03363ab61d89f07a22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:665d7a25dfc959ec5448d5ba6b430792ebde1be1580ea6809e9b3b4f94184b3f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:499c6d82390ee2dbb91628d2e42671406372fb603d697685a04145cf6dd8d0ab,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:da2736bc98bfe340e86234523d4c00220f6f79add271900981cf4ad9f4c5ee51,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:4df8dad8a5fb4805a0424cbc0b8df666b9a06b76c64f26e186f3b9e8efe6cd95,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-
exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:65c16453b5b7bb113646ffce0be26138e89eecbf6dd1582cdfe76af7f5dc62cf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:ce968dce2209ec5114772b4b73ed16c0a25988637372f2afbfac080cc6f1e378,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:b7823eaacf55280cdf3f1bede4f40bf49fdbf9ba9f3f5ba64b0abedede601c8f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:605206d967ffaa20156eb07a645654cd3e0f880bb0eefbb2b5e1e749b169f148,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:9470db6caf5102cf37ddb1f137f17b05ef7119f174f4189beb4839ef7f65730c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:34e84da4ae7e5d65931cbefcda84fd8fdc93271ec466adf1a9040b67a3af176a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:b301b17c31e47733a8a232773427ce3cb50433a3aa09d4a5bd998b1aeb5e5530,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:qu
ay.io/podified-antelope-centos9/openstack-ironic-api@sha256:d642c35c0f9d3acf31987c028f1d4d4fdf7b49e1d6cbcd73268c12b3d6e14b86,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:922eb0799ab36a91aa95abe52565dc60db807457dbf8c651b30e06b9e8aebcd4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:cd01e9605ab513458a6813e38d37fbfde1a91388cc5c00962203dbcbdc285e79,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:dd35c22b17730cbca8547ea98459f182939462c8dc3465d21335a377018937de,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:0e0e2e48a41d5417f1d6a4407e63d443611b7eacd66e27f561c9eedf3e5a66c5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:735bd24219fdb5f21c31313a5bc685364f45c004fb5e8af634984c147060d4e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:35b5554efae34f2c25a2d274c78bdaecf3d4ce949fa61c692835ee54cdfc6d74,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:01b93ab0d87482b9a1fd46706771974743dea1ca74f5fcc3de4a560f7cfc033b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:87471fbe3ba77b7115096f4fef8f5a9e1468cbd5bf6060c0
9785a60f9107a717,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:947dcc46173064939cba252d5db34eb6ddd05eb0af7afd762beebe77e9a72c6e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:8498ed720d02ce4e7045f7eb0051b138274cddba9b1e443d11e413da3474d3a3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:2cb054830655a6af5fc6848360618676d24fd9cf15078c0b9855e09d05733eec,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:0f5f8f560cd3b4951f7e8e67ef570575435b4c6915658cbb66f32a201776078b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:7055e8d7b7d72ce697c6077be14c525c019d186002f04765b90a14c82e01cc7c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:d2cd7a21461b4b569d93a63d57761f437cf6bd0847d69a3a65f64d400c7cca6d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:432c0c6f36a5e4e4db394771f7dc72f3bf9e5060dc4220f781d3c5050cc17f0d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:3ff379a74cc15352bfa25605dbb1a5f4250620e8364bf87ed2f3d5c17e6a8b26,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octa
via-api@sha256:c67a7bba2fc9351c302369b590473a737bab20d0982d227756fe1fa0bc1c8773,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:50c613d159667a26ba4bfb7aebf157b8db8919c815a866438b1d2700231a508e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:f3d3d7a7c83926a09714199406bfe8070e6be5055cbfbf00aa37f47e1e5e9bc9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:e9b3260907b0e417bb779a7d513a2639734cbbf792e77c61e05e760d06978f4a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:1aa6a76e67f2d91ee45472741238b5d4ab53f9bcb94db678c7ae92e1af28899d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:80b8547cf5821a4eb5461d1ac14edbc700ef03926268af960bf511647de027af,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content@sha256:7086442096db5ceb68e22bcce00688072957fdad07d00d8f18eb0506ad958923,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:bf42dfd2e225818662aa28c4bb23204dc47b2b91127ca0e49b085baa1ea7609d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:bd08ffdb4dcfd436200d846d15b2bdcc14122fa43adfea4c0980a087a18f9e3e,Valu
eFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:2d1e733d24df6ca02636374147f801a0ec1509f8db2f9ad8c739b3f2341815fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:c08ba2a0df4cc18e615b25c329e9c74153709b435c032c38502ec78ba297c5fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:b6cdafc7722def5b63ef4f00251e10aca93ef82628b21e88925c3d4b49277316,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:0a0bbe43e3c266dfeb40a09036f76393dc70377b636724c130a29c434f6d6c82,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:7387b628d7cfb3ff349e0df6f11f41ae7fdb0e2d55844944896af02a81ac7cf7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:9a3671dee1752ebe3639a0b16de95d29e779f1629d563e0585d65b9792542fc9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:37cc031749b113c35231066ce9f8ce7ccc83e21808ba92ea1981e72bbc42e80f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:b2782fe02b1438d68308a5847b0628f0971b5bb8bb0a4d20fe15176fa75bd33f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:7118cc3a695fead2a8bab14c8ace018ed7a5ba23ef347bf4ead44219e8467866,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9
/openstack-swift-object@sha256:793a836e17b07b0e0a4e8d3177fd04724e1e058fca275ef434abe60a2e444a79,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:713d74dc81859344bdcae68a9f7a954146c3e68cfa819518a58cce9e896298c8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:e39be536015777a1b0df8ac863f354046b2b15fee8482abd37d2fa59d8074208,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:28e209c66bc86354495ac7793f2e66db0e8540485590742ab1b53a7cf24cb4fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:d117753b6cff563084bf771173ea89a2ce00854efdc45447667e5d230c60c363,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:f1aac0a57d83b085c37cf75ce0a56f85b68353b1a88740b64a5858bc93dba36b,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bndhz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82_openstack-operators(0b6d8a4b-faca-4779-be46-219d3c0a3e22): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.314894 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n" event={"ID":"6ee14caa-a939-467a-bdbb-4160d336eaee","Type":"ContainerStarted","Data":"ee393b2478ab58f4e9313ab29f47f730bb4e45e904b638a212eaf40b420c2539"} Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.316245 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg" event={"ID":"c1571b7d-f7d4-470d-90ac-d276a39ea2b1","Type":"ContainerStarted","Data":"39c19ae308112b948066c871dda5fdc99b8be8d4280ba65239a181a74b7397cc"} 
Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.319758 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p" event={"ID":"3c1366b7-aa64-4089-a853-e2027658e237","Type":"ContainerStarted","Data":"eb6371e02e8028bda3e84eebc43ca80d8db336b7dcfbab7b12864d5414675a41"} Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.323810 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" event={"ID":"c07dffb1-ebd0-44e9-8061-ce680870aba3","Type":"ContainerStarted","Data":"87c3fca2f96f13b121e41e361ed6628002930169fbc288eecebe8770533d0d28"} Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.323836 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" event={"ID":"c07dffb1-ebd0-44e9-8061-ce680870aba3","Type":"ContainerStarted","Data":"8041ff8181b6c7697a68832c9aa52cd6b7fa39378d5852d2312de7662d6df9e9"} Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.325236 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8" event={"ID":"ca55e873-96fb-4348-ba98-58ab9648de78","Type":"ContainerStarted","Data":"4440118f2fd77c9d6761e13b73274ef59e2c785a2f84c47f569e66ed19856fda"} Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.326048 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" podUID="c07dffb1-ebd0-44e9-8061-ce680870aba3" Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.326817 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-84c745747f-v888v" event={"ID":"c0c96af5-0c02-4dfd-91e5-947696cb4899","Type":"ContainerStarted","Data":"55ff16bb5c22d04387b4a33300f218ebf29f82c5a282915eaca5ba6653c9538f"} Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.328485 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf" event={"ID":"101087b5-cd1e-40f3-916f-5e8f5354ac2d","Type":"ContainerStarted","Data":"6eb71a27d5eb4864ede587a880271d2313f02ac2549babbbd0e8da90c8303040"} Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.330419 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg" event={"ID":"89ba53cc-155b-485b-926c-83eaa0772764","Type":"ContainerStarted","Data":"4fb894587abb9deb6cfa7888517a0c677e1378e5f696e1c316b4c82b49e84f3a"} Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.335924 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" event={"ID":"9cdfca4b-0805-4ebc-92e1-906044d82e4b","Type":"ContainerStarted","Data":"dea4f09c3e88e1942a66096f06996c468049cc3cd506e1f408499ffffcf51587"} Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.335976 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" event={"ID":"9cdfca4b-0805-4ebc-92e1-906044d82e4b","Type":"ContainerStarted","Data":"56fd50741e2f0f327c235761fe38afe4397c9604a9268a4f26f716a205cdd43f"} Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.348302 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" podUID="9cdfca4b-0805-4ebc-92e1-906044d82e4b" Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.705847 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" podUID="0b6d8a4b-faca-4779-be46-219d3c0a3e22" Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.760460 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" podUID="1ed82f79-3f95-4293-937a-f5d82ce37f10" Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.761894 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" podUID="cbc3fab0-8876-49a7-a85f-4844e253595f" Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.837465 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" podUID="f93913e6-5d74-4030-ac26-a10781a72db0" Sep 30 05:42:27 crc kubenswrapper[4956]: E0930 05:42:27.845372 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" podUID="f620cf06-9ba1-4866-9964-dc38e574c889" Sep 30 05:42:27 crc kubenswrapper[4956]: I0930 05:42:27.992458 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl"] Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 05:42:28.409737 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" event={"ID":"f93913e6-5d74-4030-ac26-a10781a72db0","Type":"ContainerStarted","Data":"1fa819de3e2c8ff1e3f38bc82087e34def382f036e2fb601edf36f16609e37b2"} Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 05:42:28.409781 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" event={"ID":"f93913e6-5d74-4030-ac26-a10781a72db0","Type":"ContainerStarted","Data":"e765b8b004689707f75daf446ac96131babe0cf154b08b5a51e82ae3426c52f7"} Sep 30 05:42:28 crc kubenswrapper[4956]: E0930 05:42:28.413376 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" podUID="f93913e6-5d74-4030-ac26-a10781a72db0" Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 05:42:28.418490 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" event={"ID":"cbc3fab0-8876-49a7-a85f-4844e253595f","Type":"ContainerStarted","Data":"1c9fdb3de8236c60e56506af035e9a94fc1b851ecf0ff56538510d952125c850"} Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 05:42:28.418525 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" event={"ID":"cbc3fab0-8876-49a7-a85f-4844e253595f","Type":"ContainerStarted","Data":"6e75b2af9f8bf61ae6c3a0232027d9a960a71736885c8622058a9b0e198ddf89"} Sep 30 05:42:28 crc kubenswrapper[4956]: E0930 
05:42:28.419878 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef\\\"\"" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" podUID="cbc3fab0-8876-49a7-a85f-4844e253595f" Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 05:42:28.424535 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" event={"ID":"f620cf06-9ba1-4866-9964-dc38e574c889","Type":"ContainerStarted","Data":"9938cf7f663f24b894e8ba037fb384f852104d117df46ada61f88975df9ef515"} Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 05:42:28.424570 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" event={"ID":"f620cf06-9ba1-4866-9964-dc38e574c889","Type":"ContainerStarted","Data":"bf0b2ec80149e38be0ca39540a22929dbd08a104c2d008d8f40293d6c4e16335"} Sep 30 05:42:28 crc kubenswrapper[4956]: E0930 05:42:28.425931 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" podUID="f620cf06-9ba1-4866-9964-dc38e574c889" Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 05:42:28.426267 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j2hpj" event={"ID":"49d67534-20e0-48be-9614-eec49889c4a7","Type":"ContainerStarted","Data":"3d6df31eff431ae246e57513718bfc4f93a7c8b8ef7fd6e1c83be7f34ffde928"} Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 
05:42:28.428048 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" event={"ID":"1ed82f79-3f95-4293-937a-f5d82ce37f10","Type":"ContainerStarted","Data":"3cf7c4bbe54714672f0f5e488a52fc3c98350c8f932a1018bd0627e2937c5555"} Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 05:42:28.428069 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" event={"ID":"1ed82f79-3f95-4293-937a-f5d82ce37f10","Type":"ContainerStarted","Data":"fc21732e193cd334197a9b97b05c4cb12aae6630b4dd0b223af6ba3df3660d0c"} Sep 30 05:42:28 crc kubenswrapper[4956]: E0930 05:42:28.434197 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" podUID="1ed82f79-3f95-4293-937a-f5d82ce37f10" Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 05:42:28.441160 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" event={"ID":"0b6d8a4b-faca-4779-be46-219d3c0a3e22","Type":"ContainerStarted","Data":"13f7adffcb7763386e4a97841ef590bbae43969e142e8dd43baf86cd09054877"} Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 05:42:28.441201 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" event={"ID":"0b6d8a4b-faca-4779-be46-219d3c0a3e22","Type":"ContainerStarted","Data":"1ef0e7216a42d86a00dff484fdeb3ba84479f67a4b53bbf67be7ffd3a3e590da"} Sep 30 05:42:28 crc kubenswrapper[4956]: E0930 05:42:28.445638 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" podUID="0b6d8a4b-faca-4779-be46-219d3c0a3e22" Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 05:42:28.452054 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" event={"ID":"2e369192-5374-4d18-954d-7d46ff60e9c1","Type":"ContainerStarted","Data":"37d6a8ac570a21ea88ea17ac9ded16430df000650a13373dc22ed2102615e9cb"} Sep 30 05:42:28 crc kubenswrapper[4956]: I0930 05:42:28.452126 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" event={"ID":"2e369192-5374-4d18-954d-7d46ff60e9c1","Type":"ContainerStarted","Data":"d1324a62fe0cd5ba002f83b147cbc2fcfa86dc086007f6e25fbdc05e20df6ebc"} Sep 30 05:42:28 crc kubenswrapper[4956]: E0930 05:42:28.456929 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" podUID="c07dffb1-ebd0-44e9-8061-ce680870aba3" Sep 30 05:42:28 crc kubenswrapper[4956]: E0930 05:42:28.457229 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" podUID="9cdfca4b-0805-4ebc-92e1-906044d82e4b" 
Sep 30 05:42:29 crc kubenswrapper[4956]: I0930 05:42:29.463466 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" event={"ID":"2e369192-5374-4d18-954d-7d46ff60e9c1","Type":"ContainerStarted","Data":"0a84a0321e601de9e3f7a936c28b6a39cb63637de1f68d7a2331a46c0c204a2b"} Sep 30 05:42:29 crc kubenswrapper[4956]: E0930 05:42:29.470471 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" podUID="f620cf06-9ba1-4866-9964-dc38e574c889" Sep 30 05:42:29 crc kubenswrapper[4956]: E0930 05:42:29.470513 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" podUID="0b6d8a4b-faca-4779-be46-219d3c0a3e22" Sep 30 05:42:29 crc kubenswrapper[4956]: E0930 05:42:29.470580 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" podUID="f93913e6-5d74-4030-ac26-a10781a72db0" Sep 30 05:42:29 crc kubenswrapper[4956]: E0930 05:42:29.470619 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef\\\"\"" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" podUID="cbc3fab0-8876-49a7-a85f-4844e253595f" Sep 30 05:42:29 crc kubenswrapper[4956]: E0930 05:42:29.470648 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" podUID="1ed82f79-3f95-4293-937a-f5d82ce37f10" Sep 30 05:42:29 crc kubenswrapper[4956]: I0930 05:42:29.610407 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" podStartSLOduration=4.610392226 podStartE2EDuration="4.610392226s" podCreationTimestamp="2025-09-30 05:42:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:42:29.608363971 +0000 UTC m=+819.935484486" watchObservedRunningTime="2025-09-30 05:42:29.610392226 +0000 UTC m=+819.937512751" Sep 30 05:42:30 crc kubenswrapper[4956]: I0930 05:42:30.469977 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" Sep 30 05:42:30 crc kubenswrapper[4956]: E0930 05:42:30.917682 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Sep 30 05:42:37 crc kubenswrapper[4956]: I0930 05:42:37.287377 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-7nhwl" Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.542447 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg" event={"ID":"2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f","Type":"ContainerStarted","Data":"fabf288029b2f8bf9180cfa0088065aa8e06428506d3ba60c746327e8541eb89"} Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.544948 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8" event={"ID":"124abac1-4adc-4a56-8d2b-241e0eb4bf57","Type":"ContainerStarted","Data":"5478885b5ba3fcd9a4b8f714bb9f193858fb32b45ff0d357d429715b82176bcd"} Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.557429 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx" event={"ID":"69909a1b-9121-45ae-aaeb-e63950300ec9","Type":"ContainerStarted","Data":"293fa35d07540c3569e1f94b2eefbaadb61f7cb0ef0a64ae664641ea7ab7a172"} Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.561549 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-v888v" event={"ID":"c0c96af5-0c02-4dfd-91e5-947696cb4899","Type":"ContainerStarted","Data":"c3e0e0cf282cf7232016a66f617eff9705ea1c429151acb38cf8eb3da85f9e7a"} Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.563425 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p" event={"ID":"3c1366b7-aa64-4089-a853-e2027658e237","Type":"ContainerStarted","Data":"4cd4b473828bb1fc7dbd098673a88219da3dac30b970aac7c30b46c15efda9a9"} Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.564714 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg" event={"ID":"c1571b7d-f7d4-470d-90ac-d276a39ea2b1","Type":"ContainerStarted","Data":"c1ad1e4451184224ea10464b662f4a2dc9b58c2c97cb314fd445b42190ed9e43"} Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.565796 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf" event={"ID":"101087b5-cd1e-40f3-916f-5e8f5354ac2d","Type":"ContainerStarted","Data":"721de2fb7f6c7a6cfcdf3b0787a055d7c79f9246e2116e98dbcc3e59bc86c13e"} Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.566900 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8" event={"ID":"ca55e873-96fb-4348-ba98-58ab9648de78","Type":"ContainerStarted","Data":"6cbdfa9b601d70c0856b7bd286d9dcc2b5cbb537b83c9c8a7a236e655128fa8d"} Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.572631 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k" event={"ID":"e8df7824-e9a8-4794-bb91-411ae6639639","Type":"ContainerStarted","Data":"92504931c894281d1b8791173b231f72d118a3b8bb46dede169293c8bc630856"} Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.582665 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc" event={"ID":"8ce74c21-dde5-40bb-8c42-96e4165b8541","Type":"ContainerStarted","Data":"a3e502431db60c3b41061250b0e20c64117d8aba37c2c48b64cef753d52349ae"} Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.604836 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg" event={"ID":"89ba53cc-155b-485b-926c-83eaa0772764","Type":"ContainerStarted","Data":"53409bd2f03025948ae03d7688883f452a9089dc452ec2a21419f3f55db539c8"} Sep 30 05:42:38 crc 
kubenswrapper[4956]: I0930 05:42:38.612358 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h" event={"ID":"8c84e3e7-f42f-46df-af16-516bc2cac4a0","Type":"ContainerStarted","Data":"0853562af4acad06a61e7354639ce52772edf8490746ca66a5239f7fcca5135d"} Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.617291 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n" event={"ID":"6ee14caa-a939-467a-bdbb-4160d336eaee","Type":"ContainerStarted","Data":"53b1c3f8aaaebbc179d847798003f4549ced94d3918dae67780758e7875635f0"} Sep 30 05:42:38 crc kubenswrapper[4956]: I0930 05:42:38.625161 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j2hpj" event={"ID":"49d67534-20e0-48be-9614-eec49889c4a7","Type":"ContainerStarted","Data":"0596a8415f3a5bc678574ef19d71d9c5a1fa98817512239a1f30946423c9f046"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.633479 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8" event={"ID":"ca55e873-96fb-4348-ba98-58ab9648de78","Type":"ContainerStarted","Data":"9b01968863f86718766d2e94c9484cffa72c44da20ffbf5688112e39394d9696"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.634258 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.635799 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg" event={"ID":"c1571b7d-f7d4-470d-90ac-d276a39ea2b1","Type":"ContainerStarted","Data":"a35acad824aa12c417a50cda75250cf80c7ecc22fdc6cab42d782f8086996b33"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.636016 4956 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.637727 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-v888v" event={"ID":"c0c96af5-0c02-4dfd-91e5-947696cb4899","Type":"ContainerStarted","Data":"7e02e860b7d956dfde2eae246dbd8600c6b32cd920ab5a2bcfc46b275542663c"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.637859 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-v888v" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.639536 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc" event={"ID":"8ce74c21-dde5-40bb-8c42-96e4165b8541","Type":"ContainerStarted","Data":"efdd3a458da847b6ec129f9846acd7ce9a78e85134b85d4acf3620b3dc2d69e3"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.639734 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.641627 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg" event={"ID":"89ba53cc-155b-485b-926c-83eaa0772764","Type":"ContainerStarted","Data":"ea9b1e0477da87f5c053aef8e7bdba41edec98dcb7e849a412663c34919fa134"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.641697 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.643873 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h" event={"ID":"8c84e3e7-f42f-46df-af16-516bc2cac4a0","Type":"ContainerStarted","Data":"d45564032d959cec625b7c15f3c2b611d10bf634754a91d8f28281838281d020"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.643990 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.646145 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2" event={"ID":"ffab19aa-8b8f-4067-b19c-3ccd9352cb12","Type":"ContainerStarted","Data":"c453baf0d7c2d86489a5e75380900c006ff0cd4a0db947e4d5c6fe964ed64ace"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.646176 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2" event={"ID":"ffab19aa-8b8f-4067-b19c-3ccd9352cb12","Type":"ContainerStarted","Data":"33016669ae07530028e624900adaf7109730c0928c6d1b61476a69685f56f330"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.646328 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.647892 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k" event={"ID":"e8df7824-e9a8-4794-bb91-411ae6639639","Type":"ContainerStarted","Data":"620f54082f43d3646faf1ebbc02f302be0952257d0b4375988978c41e046888c"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.648025 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.650420 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf" event={"ID":"101087b5-cd1e-40f3-916f-5e8f5354ac2d","Type":"ContainerStarted","Data":"786b53db0980b329b3b5073e27f32e6435e417bda877f4784dc4c00569cb8987"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.650837 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.652097 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p" event={"ID":"3c1366b7-aa64-4089-a853-e2027658e237","Type":"ContainerStarted","Data":"54db2ec6c7cb1e5d9460e9f22828503e54075b63837e643cae89bf9a1b5cbc1b"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.652193 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.654145 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg" event={"ID":"2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f","Type":"ContainerStarted","Data":"2aa5f4818e5adefb08f84f293cc36cfff43e7270b727c0c248dcc1079d563dbd"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.654267 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.657305 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n" event={"ID":"6ee14caa-a939-467a-bdbb-4160d336eaee","Type":"ContainerStarted","Data":"7c42dc9b21ae2df236208e177204ce22c5be6894d630939a993bc01121a97271"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 
05:42:39.657526 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.660065 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8" podStartSLOduration=5.024709242 podStartE2EDuration="15.66004898s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.05628999 +0000 UTC m=+817.383410515" lastFinishedPulling="2025-09-30 05:42:37.691629718 +0000 UTC m=+828.018750253" observedRunningTime="2025-09-30 05:42:39.659283516 +0000 UTC m=+829.986404041" watchObservedRunningTime="2025-09-30 05:42:39.66004898 +0000 UTC m=+829.987169515" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.660196 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8" event={"ID":"124abac1-4adc-4a56-8d2b-241e0eb4bf57","Type":"ContainerStarted","Data":"1f152f76f47b86365e45f22aab72fae5486842d69f37ac27a0bd747bbda3d304"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.660362 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.662353 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j2hpj" podStartSLOduration=4.190109274 podStartE2EDuration="14.662344533s" podCreationTimestamp="2025-09-30 05:42:25 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.282775664 +0000 UTC m=+817.609896189" lastFinishedPulling="2025-09-30 05:42:37.755010923 +0000 UTC m=+828.082131448" observedRunningTime="2025-09-30 05:42:38.650565655 +0000 UTC m=+828.977686180" watchObservedRunningTime="2025-09-30 05:42:39.662344533 
+0000 UTC m=+829.989465058" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.663380 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx" event={"ID":"69909a1b-9121-45ae-aaeb-e63950300ec9","Type":"ContainerStarted","Data":"7815c53ed843328615b6f1c2956fee5a333389a5661517d21635c28559961eef"} Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.663632 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.717867 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2" podStartSLOduration=5.012820486 podStartE2EDuration="15.717839548s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.011410821 +0000 UTC m=+817.338531346" lastFinishedPulling="2025-09-30 05:42:37.716429873 +0000 UTC m=+828.043550408" observedRunningTime="2025-09-30 05:42:39.685083212 +0000 UTC m=+830.012203737" watchObservedRunningTime="2025-09-30 05:42:39.717839548 +0000 UTC m=+830.044960063" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.718054 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k" podStartSLOduration=3.952635388 podStartE2EDuration="15.718050245s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:25.837147225 +0000 UTC m=+816.164267750" lastFinishedPulling="2025-09-30 05:42:37.602562082 +0000 UTC m=+827.929682607" observedRunningTime="2025-09-30 05:42:39.711505258 +0000 UTC m=+830.038625783" watchObservedRunningTime="2025-09-30 05:42:39.718050245 +0000 UTC m=+830.045170770" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.746449 4956 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg" podStartSLOduration=5.04867783 podStartE2EDuration="15.746432802s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.015876332 +0000 UTC m=+817.342996857" lastFinishedPulling="2025-09-30 05:42:37.713631304 +0000 UTC m=+828.040751829" observedRunningTime="2025-09-30 05:42:39.745893065 +0000 UTC m=+830.073013590" watchObservedRunningTime="2025-09-30 05:42:39.746432802 +0000 UTC m=+830.073553327" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.752852 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg" podStartSLOduration=4.167461767 podStartE2EDuration="14.752840285s" podCreationTimestamp="2025-09-30 05:42:25 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.01770425 +0000 UTC m=+817.344824775" lastFinishedPulling="2025-09-30 05:42:37.603082768 +0000 UTC m=+827.930203293" observedRunningTime="2025-09-30 05:42:39.733178583 +0000 UTC m=+830.060299128" watchObservedRunningTime="2025-09-30 05:42:39.752840285 +0000 UTC m=+830.079960810" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.789819 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h" podStartSLOduration=4.651075157 podStartE2EDuration="15.789012329s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:26.578776479 +0000 UTC m=+816.905897004" lastFinishedPulling="2025-09-30 05:42:37.716713641 +0000 UTC m=+828.043834176" observedRunningTime="2025-09-30 05:42:39.782564465 +0000 UTC m=+830.109684990" watchObservedRunningTime="2025-09-30 05:42:39.789012329 +0000 UTC m=+830.116132854" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.802316 4956 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p" podStartSLOduration=4.734178244 podStartE2EDuration="15.802299889s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:26.64647169 +0000 UTC m=+816.973592215" lastFinishedPulling="2025-09-30 05:42:37.714593335 +0000 UTC m=+828.041713860" observedRunningTime="2025-09-30 05:42:39.801975929 +0000 UTC m=+830.129096454" watchObservedRunningTime="2025-09-30 05:42:39.802299889 +0000 UTC m=+830.129420404" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.852768 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc" podStartSLOduration=4.212389963 podStartE2EDuration="15.852751605s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:26.003585918 +0000 UTC m=+816.330706443" lastFinishedPulling="2025-09-30 05:42:37.64394754 +0000 UTC m=+827.971068085" observedRunningTime="2025-09-30 05:42:39.847653933 +0000 UTC m=+830.174774458" watchObservedRunningTime="2025-09-30 05:42:39.852751605 +0000 UTC m=+830.179872130" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.871756 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n" podStartSLOduration=5.192809599 podStartE2EDuration="15.871740095s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:26.978559462 +0000 UTC m=+817.305679987" lastFinishedPulling="2025-09-30 05:42:37.657489958 +0000 UTC m=+827.984610483" observedRunningTime="2025-09-30 05:42:39.870678091 +0000 UTC m=+830.197798616" watchObservedRunningTime="2025-09-30 05:42:39.871740095 +0000 UTC m=+830.198860620" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.902341 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-84c745747f-v888v" podStartSLOduration=5.299913097 podStartE2EDuration="15.902324313s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.010951677 +0000 UTC m=+817.338072202" lastFinishedPulling="2025-09-30 05:42:37.613362893 +0000 UTC m=+827.940483418" observedRunningTime="2025-09-30 05:42:39.900948399 +0000 UTC m=+830.228068924" watchObservedRunningTime="2025-09-30 05:42:39.902324313 +0000 UTC m=+830.229444838" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.948137 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf" podStartSLOduration=4.836347046 podStartE2EDuration="15.9481029s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:26.579063328 +0000 UTC m=+816.906183853" lastFinishedPulling="2025-09-30 05:42:37.690819182 +0000 UTC m=+828.017939707" observedRunningTime="2025-09-30 05:42:39.940336085 +0000 UTC m=+830.267456610" watchObservedRunningTime="2025-09-30 05:42:39.9481029 +0000 UTC m=+830.275223425" Sep 30 05:42:39 crc kubenswrapper[4956]: I0930 05:42:39.978199 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg" podStartSLOduration=4.928569242 podStartE2EDuration="15.978184631s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:26.607187698 +0000 UTC m=+816.934308233" lastFinishedPulling="2025-09-30 05:42:37.656803097 +0000 UTC m=+827.983923622" observedRunningTime="2025-09-30 05:42:39.974398992 +0000 UTC m=+830.301519527" watchObservedRunningTime="2025-09-30 05:42:39.978184631 +0000 UTC m=+830.305305156" Sep 30 05:42:40 crc kubenswrapper[4956]: I0930 05:42:40.003524 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx" podStartSLOduration=5.370784026 podStartE2EDuration="16.003507952s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.011217175 +0000 UTC m=+817.338337700" lastFinishedPulling="2025-09-30 05:42:37.64394109 +0000 UTC m=+827.971061626" observedRunningTime="2025-09-30 05:42:40.000200238 +0000 UTC m=+830.327320773" watchObservedRunningTime="2025-09-30 05:42:40.003507952 +0000 UTC m=+830.330628477" Sep 30 05:42:40 crc kubenswrapper[4956]: I0930 05:42:40.043615 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8" podStartSLOduration=5.41102268 podStartE2EDuration="16.04359699s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.011780103 +0000 UTC m=+817.338900628" lastFinishedPulling="2025-09-30 05:42:37.644354413 +0000 UTC m=+827.971474938" observedRunningTime="2025-09-30 05:42:40.040072049 +0000 UTC m=+830.367192574" watchObservedRunningTime="2025-09-30 05:42:40.04359699 +0000 UTC m=+830.370717515" Sep 30 05:42:43 crc kubenswrapper[4956]: I0930 05:42:43.342796 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.703715 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" event={"ID":"0b6d8a4b-faca-4779-be46-219d3c0a3e22","Type":"ContainerStarted","Data":"71220b713bb362bda7f9419f9fca17189e4fd81d2c467a1837e7ea7a296b29ad"} Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.704926 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.705483 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" event={"ID":"c07dffb1-ebd0-44e9-8061-ce680870aba3","Type":"ContainerStarted","Data":"6213aed107864438695a2d8203d55be0ee1afca8b00461e8d53fac1bf0255bbc"} Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.705744 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.707853 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" event={"ID":"9cdfca4b-0805-4ebc-92e1-906044d82e4b","Type":"ContainerStarted","Data":"4efc470a1116c303cfcbdd49ff93933c85e4ccaad3b7fd194f4bd202a23523c6"} Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.708404 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.710756 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" event={"ID":"f93913e6-5d74-4030-ac26-a10781a72db0","Type":"ContainerStarted","Data":"62be50fa46cb74a1b4db620ba651e00e7ed78f8a9e7d38ef7419744ddfa026c3"} Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.711089 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.712480 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" event={"ID":"cbc3fab0-8876-49a7-a85f-4844e253595f","Type":"ContainerStarted","Data":"e117bc0b1d6b5404caf667afa5690a55d6caa7be3ad26d1e4e6bb7cff68ed3d9"} Sep 30 05:42:44 crc kubenswrapper[4956]: 
I0930 05:42:44.712922 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.714549 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" event={"ID":"f620cf06-9ba1-4866-9964-dc38e574c889","Type":"ContainerStarted","Data":"89d28529b7bd5f9dff71c032d8a9e0ccdd5e425d8a9cd991a14457ad68401395"} Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.714960 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.733816 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" podStartSLOduration=3.73040381 podStartE2EDuration="20.73380322s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.313606249 +0000 UTC m=+817.640726774" lastFinishedPulling="2025-09-30 05:42:44.317005659 +0000 UTC m=+834.644126184" observedRunningTime="2025-09-30 05:42:44.730576228 +0000 UTC m=+835.057696753" watchObservedRunningTime="2025-09-30 05:42:44.73380322 +0000 UTC m=+835.060923745" Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.750083 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" podStartSLOduration=3.7034665479999997 podStartE2EDuration="20.750069764s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.286372257 +0000 UTC m=+817.613492782" lastFinishedPulling="2025-09-30 05:42:44.332975473 +0000 UTC m=+834.660095998" observedRunningTime="2025-09-30 05:42:44.748102973 +0000 UTC m=+835.075223498" 
watchObservedRunningTime="2025-09-30 05:42:44.750069764 +0000 UTC m=+835.077190289" Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.769529 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" podStartSLOduration=3.733177778 podStartE2EDuration="20.76951227s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.283039631 +0000 UTC m=+817.610160156" lastFinishedPulling="2025-09-30 05:42:44.319374123 +0000 UTC m=+834.646494648" observedRunningTime="2025-09-30 05:42:44.767230258 +0000 UTC m=+835.094350783" watchObservedRunningTime="2025-09-30 05:42:44.76951227 +0000 UTC m=+835.096632795" Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.801691 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" podStartSLOduration=2.5395836149999997 podStartE2EDuration="19.801669456s" podCreationTimestamp="2025-09-30 05:42:25 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.056377204 +0000 UTC m=+817.383497719" lastFinishedPulling="2025-09-30 05:42:44.318463025 +0000 UTC m=+834.645583560" observedRunningTime="2025-09-30 05:42:44.801541752 +0000 UTC m=+835.128662287" watchObservedRunningTime="2025-09-30 05:42:44.801669456 +0000 UTC m=+835.128789991" Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.826875 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" podStartSLOduration=3.512444687 podStartE2EDuration="20.826851573s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.064314904 +0000 UTC m=+817.391435429" lastFinishedPulling="2025-09-30 05:42:44.37872179 +0000 UTC m=+834.705842315" observedRunningTime="2025-09-30 05:42:44.82454799 +0000 UTC m=+835.151668525" 
watchObservedRunningTime="2025-09-30 05:42:44.826851573 +0000 UTC m=+835.153972118" Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.854579 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" podStartSLOduration=2.819662273 podStartE2EDuration="19.854562539s" podCreationTimestamp="2025-09-30 05:42:25 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.283771695 +0000 UTC m=+817.610892220" lastFinishedPulling="2025-09-30 05:42:44.318671941 +0000 UTC m=+834.645792486" observedRunningTime="2025-09-30 05:42:44.849612192 +0000 UTC m=+835.176732727" watchObservedRunningTime="2025-09-30 05:42:44.854562539 +0000 UTC m=+835.181683064" Sep 30 05:42:44 crc kubenswrapper[4956]: I0930 05:42:44.945281 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-mrz5k" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.021576 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-kvfmc" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.097851 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-mklbg" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.138462 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-ncr6h" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.245173 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-jvf6n" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.271489 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/cinder-operator-controller-manager-859cd486d-hhv4p" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.288628 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-6hssf" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.294348 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-nghm2" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.356917 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-m8xs8" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.390517 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-5x2c8" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.477619 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-v888v" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.576044 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-xh7wx" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.613018 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-jz8bg" Sep 30 05:42:45 crc kubenswrapper[4956]: I0930 05:42:45.758392 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-g85xg" Sep 30 05:42:46 crc kubenswrapper[4956]: I0930 05:42:46.734104 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" 
event={"ID":"1ed82f79-3f95-4293-937a-f5d82ce37f10","Type":"ContainerStarted","Data":"7e25c3e86b374ea799254d4f40409d91e5b4d01c2c2cb5b007d88c25b3de5e22"} Sep 30 05:42:46 crc kubenswrapper[4956]: I0930 05:42:46.734489 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" Sep 30 05:42:46 crc kubenswrapper[4956]: I0930 05:42:46.750312 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" podStartSLOduration=4.191272566 podStartE2EDuration="22.750293143s" podCreationTimestamp="2025-09-30 05:42:24 +0000 UTC" firstStartedPulling="2025-09-30 05:42:27.285094757 +0000 UTC m=+817.612215282" lastFinishedPulling="2025-09-30 05:42:45.844115334 +0000 UTC m=+836.171235859" observedRunningTime="2025-09-30 05:42:46.746988738 +0000 UTC m=+837.074109253" watchObservedRunningTime="2025-09-30 05:42:46.750293143 +0000 UTC m=+837.077413668" Sep 30 05:42:55 crc kubenswrapper[4956]: I0930 05:42:55.387934 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-nhj4d" Sep 30 05:42:55 crc kubenswrapper[4956]: I0930 05:42:55.403968 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-j79gf" Sep 30 05:42:55 crc kubenswrapper[4956]: I0930 05:42:55.446271 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-2rcv8" Sep 30 05:42:55 crc kubenswrapper[4956]: I0930 05:42:55.580240 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-6hgwr" Sep 30 05:42:55 crc kubenswrapper[4956]: I0930 05:42:55.599779 4956 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-wxgr6" Sep 30 05:42:55 crc kubenswrapper[4956]: I0930 05:42:55.615087 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-sbn5x" Sep 30 05:42:56 crc kubenswrapper[4956]: I0930 05:42:56.069541 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.364838 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-7qh59"] Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.366409 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-7qh59" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.370305 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.370549 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.370790 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-c2jm8" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.370940 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.387054 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-7qh59"] Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.468272 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-sxs9t"] Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.469780 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.472302 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.484663 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-sxs9t"] Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.502732 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdac8ebb-8d05-4766-9f09-0d6084d2168f-config\") pod \"dnsmasq-dns-8468885bfc-7qh59\" (UID: \"bdac8ebb-8d05-4766-9f09-0d6084d2168f\") " pod="openstack/dnsmasq-dns-8468885bfc-7qh59" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.502777 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-588t9\" (UniqueName: \"kubernetes.io/projected/bdac8ebb-8d05-4766-9f09-0d6084d2168f-kube-api-access-588t9\") pod \"dnsmasq-dns-8468885bfc-7qh59\" (UID: \"bdac8ebb-8d05-4766-9f09-0d6084d2168f\") " pod="openstack/dnsmasq-dns-8468885bfc-7qh59" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.605218 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdac8ebb-8d05-4766-9f09-0d6084d2168f-config\") pod \"dnsmasq-dns-8468885bfc-7qh59\" (UID: \"bdac8ebb-8d05-4766-9f09-0d6084d2168f\") " pod="openstack/dnsmasq-dns-8468885bfc-7qh59" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.605295 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-588t9\" (UniqueName: \"kubernetes.io/projected/bdac8ebb-8d05-4766-9f09-0d6084d2168f-kube-api-access-588t9\") pod \"dnsmasq-dns-8468885bfc-7qh59\" (UID: \"bdac8ebb-8d05-4766-9f09-0d6084d2168f\") " pod="openstack/dnsmasq-dns-8468885bfc-7qh59" Sep 30 05:43:14 crc 
kubenswrapper[4956]: I0930 05:43:14.605325 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g525k\" (UniqueName: \"kubernetes.io/projected/0476d8b4-b1a2-4575-8e7e-363da630e6c3-kube-api-access-g525k\") pod \"dnsmasq-dns-545d49fd5c-sxs9t\" (UID: \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\") " pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.605405 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0476d8b4-b1a2-4575-8e7e-363da630e6c3-dns-svc\") pod \"dnsmasq-dns-545d49fd5c-sxs9t\" (UID: \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\") " pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.605430 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0476d8b4-b1a2-4575-8e7e-363da630e6c3-config\") pod \"dnsmasq-dns-545d49fd5c-sxs9t\" (UID: \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\") " pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.606430 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdac8ebb-8d05-4766-9f09-0d6084d2168f-config\") pod \"dnsmasq-dns-8468885bfc-7qh59\" (UID: \"bdac8ebb-8d05-4766-9f09-0d6084d2168f\") " pod="openstack/dnsmasq-dns-8468885bfc-7qh59" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.631999 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-588t9\" (UniqueName: \"kubernetes.io/projected/bdac8ebb-8d05-4766-9f09-0d6084d2168f-kube-api-access-588t9\") pod \"dnsmasq-dns-8468885bfc-7qh59\" (UID: \"bdac8ebb-8d05-4766-9f09-0d6084d2168f\") " pod="openstack/dnsmasq-dns-8468885bfc-7qh59" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 
05:43:14.683737 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-7qh59" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.706861 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0476d8b4-b1a2-4575-8e7e-363da630e6c3-dns-svc\") pod \"dnsmasq-dns-545d49fd5c-sxs9t\" (UID: \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\") " pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.706908 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0476d8b4-b1a2-4575-8e7e-363da630e6c3-config\") pod \"dnsmasq-dns-545d49fd5c-sxs9t\" (UID: \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\") " pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.706947 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g525k\" (UniqueName: \"kubernetes.io/projected/0476d8b4-b1a2-4575-8e7e-363da630e6c3-kube-api-access-g525k\") pod \"dnsmasq-dns-545d49fd5c-sxs9t\" (UID: \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\") " pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.708061 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0476d8b4-b1a2-4575-8e7e-363da630e6c3-config\") pod \"dnsmasq-dns-545d49fd5c-sxs9t\" (UID: \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\") " pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.708170 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0476d8b4-b1a2-4575-8e7e-363da630e6c3-dns-svc\") pod \"dnsmasq-dns-545d49fd5c-sxs9t\" (UID: \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\") " 
pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.723973 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g525k\" (UniqueName: \"kubernetes.io/projected/0476d8b4-b1a2-4575-8e7e-363da630e6c3-kube-api-access-g525k\") pod \"dnsmasq-dns-545d49fd5c-sxs9t\" (UID: \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\") " pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:43:14 crc kubenswrapper[4956]: I0930 05:43:14.790882 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:43:15 crc kubenswrapper[4956]: I0930 05:43:15.113663 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-7qh59"] Sep 30 05:43:15 crc kubenswrapper[4956]: I0930 05:43:15.237786 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-sxs9t"] Sep 30 05:43:15 crc kubenswrapper[4956]: W0930 05:43:15.253426 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0476d8b4_b1a2_4575_8e7e_363da630e6c3.slice/crio-d5ec7611b14ee7ce301e12c665093eed9d6c2197c5bdb9a061358cf5d5b0a42f WatchSource:0}: Error finding container d5ec7611b14ee7ce301e12c665093eed9d6c2197c5bdb9a061358cf5d5b0a42f: Status 404 returned error can't find the container with id d5ec7611b14ee7ce301e12c665093eed9d6c2197c5bdb9a061358cf5d5b0a42f Sep 30 05:43:15 crc kubenswrapper[4956]: I0930 05:43:15.963400 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" event={"ID":"0476d8b4-b1a2-4575-8e7e-363da630e6c3","Type":"ContainerStarted","Data":"d5ec7611b14ee7ce301e12c665093eed9d6c2197c5bdb9a061358cf5d5b0a42f"} Sep 30 05:43:15 crc kubenswrapper[4956]: I0930 05:43:15.964560 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8468885bfc-7qh59" 
event={"ID":"bdac8ebb-8d05-4766-9f09-0d6084d2168f","Type":"ContainerStarted","Data":"b8b5d65c8d744b2336bf8cfd50b7a6b8b038500c5d00d5b04084235f8a018e85"} Sep 30 05:43:18 crc kubenswrapper[4956]: I0930 05:43:18.073170 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:43:18 crc kubenswrapper[4956]: I0930 05:43:18.073226 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:43:18 crc kubenswrapper[4956]: I0930 05:43:18.767083 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-sxs9t"] Sep 30 05:43:18 crc kubenswrapper[4956]: I0930 05:43:18.795047 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-bv9sv"] Sep 30 05:43:18 crc kubenswrapper[4956]: I0930 05:43:18.796460 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:43:18 crc kubenswrapper[4956]: I0930 05:43:18.805345 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-bv9sv"] Sep 30 05:43:18 crc kubenswrapper[4956]: I0930 05:43:18.899309 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/848f53ca-813b-4f49-ba68-8d7a22732438-dns-svc\") pod \"dnsmasq-dns-b9b4959cc-bv9sv\" (UID: \"848f53ca-813b-4f49-ba68-8d7a22732438\") " pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:43:18 crc kubenswrapper[4956]: I0930 05:43:18.899394 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/848f53ca-813b-4f49-ba68-8d7a22732438-config\") pod \"dnsmasq-dns-b9b4959cc-bv9sv\" (UID: \"848f53ca-813b-4f49-ba68-8d7a22732438\") " pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:43:18 crc kubenswrapper[4956]: I0930 05:43:18.899439 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dq2b\" (UniqueName: \"kubernetes.io/projected/848f53ca-813b-4f49-ba68-8d7a22732438-kube-api-access-6dq2b\") pod \"dnsmasq-dns-b9b4959cc-bv9sv\" (UID: \"848f53ca-813b-4f49-ba68-8d7a22732438\") " pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.000617 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dq2b\" (UniqueName: \"kubernetes.io/projected/848f53ca-813b-4f49-ba68-8d7a22732438-kube-api-access-6dq2b\") pod \"dnsmasq-dns-b9b4959cc-bv9sv\" (UID: \"848f53ca-813b-4f49-ba68-8d7a22732438\") " pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.000704 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/848f53ca-813b-4f49-ba68-8d7a22732438-dns-svc\") pod \"dnsmasq-dns-b9b4959cc-bv9sv\" (UID: \"848f53ca-813b-4f49-ba68-8d7a22732438\") " pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.000767 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/848f53ca-813b-4f49-ba68-8d7a22732438-config\") pod \"dnsmasq-dns-b9b4959cc-bv9sv\" (UID: \"848f53ca-813b-4f49-ba68-8d7a22732438\") " pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.001661 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/848f53ca-813b-4f49-ba68-8d7a22732438-dns-svc\") pod \"dnsmasq-dns-b9b4959cc-bv9sv\" (UID: \"848f53ca-813b-4f49-ba68-8d7a22732438\") " pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.001688 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/848f53ca-813b-4f49-ba68-8d7a22732438-config\") pod \"dnsmasq-dns-b9b4959cc-bv9sv\" (UID: \"848f53ca-813b-4f49-ba68-8d7a22732438\") " pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.035178 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dq2b\" (UniqueName: \"kubernetes.io/projected/848f53ca-813b-4f49-ba68-8d7a22732438-kube-api-access-6dq2b\") pod \"dnsmasq-dns-b9b4959cc-bv9sv\" (UID: \"848f53ca-813b-4f49-ba68-8d7a22732438\") " pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.102418 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-7qh59"] Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.123433 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-86b8f4ff9-vpm8h"] Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.125478 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.126809 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.139741 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-vpm8h"] Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.203057 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/505e587f-c235-464f-9bea-d5ae3f8fb219-config\") pod \"dnsmasq-dns-86b8f4ff9-vpm8h\" (UID: \"505e587f-c235-464f-9bea-d5ae3f8fb219\") " pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.203147 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/505e587f-c235-464f-9bea-d5ae3f8fb219-dns-svc\") pod \"dnsmasq-dns-86b8f4ff9-vpm8h\" (UID: \"505e587f-c235-464f-9bea-d5ae3f8fb219\") " pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.203179 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9zsj\" (UniqueName: \"kubernetes.io/projected/505e587f-c235-464f-9bea-d5ae3f8fb219-kube-api-access-j9zsj\") pod \"dnsmasq-dns-86b8f4ff9-vpm8h\" (UID: \"505e587f-c235-464f-9bea-d5ae3f8fb219\") " pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.305219 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/505e587f-c235-464f-9bea-d5ae3f8fb219-config\") pod \"dnsmasq-dns-86b8f4ff9-vpm8h\" (UID: \"505e587f-c235-464f-9bea-d5ae3f8fb219\") " pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.305323 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/505e587f-c235-464f-9bea-d5ae3f8fb219-dns-svc\") pod \"dnsmasq-dns-86b8f4ff9-vpm8h\" (UID: \"505e587f-c235-464f-9bea-d5ae3f8fb219\") " pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.305372 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9zsj\" (UniqueName: \"kubernetes.io/projected/505e587f-c235-464f-9bea-d5ae3f8fb219-kube-api-access-j9zsj\") pod \"dnsmasq-dns-86b8f4ff9-vpm8h\" (UID: \"505e587f-c235-464f-9bea-d5ae3f8fb219\") " pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.307345 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/505e587f-c235-464f-9bea-d5ae3f8fb219-config\") pod \"dnsmasq-dns-86b8f4ff9-vpm8h\" (UID: \"505e587f-c235-464f-9bea-d5ae3f8fb219\") " pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.308231 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/505e587f-c235-464f-9bea-d5ae3f8fb219-dns-svc\") pod \"dnsmasq-dns-86b8f4ff9-vpm8h\" (UID: \"505e587f-c235-464f-9bea-d5ae3f8fb219\") " pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.336269 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9zsj\" (UniqueName: \"kubernetes.io/projected/505e587f-c235-464f-9bea-d5ae3f8fb219-kube-api-access-j9zsj\") pod 
\"dnsmasq-dns-86b8f4ff9-vpm8h\" (UID: \"505e587f-c235-464f-9bea-d5ae3f8fb219\") " pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.372697 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-bv9sv"] Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.425755 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5449989c59-lmpj4"] Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.431024 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.433273 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-lmpj4"] Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.508486 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.509009 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/093f99c7-9f50-497d-b89c-33887663166b-dns-svc\") pod \"dnsmasq-dns-5449989c59-lmpj4\" (UID: \"093f99c7-9f50-497d-b89c-33887663166b\") " pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.509078 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n78sp\" (UniqueName: \"kubernetes.io/projected/093f99c7-9f50-497d-b89c-33887663166b-kube-api-access-n78sp\") pod \"dnsmasq-dns-5449989c59-lmpj4\" (UID: \"093f99c7-9f50-497d-b89c-33887663166b\") " pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.509178 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/093f99c7-9f50-497d-b89c-33887663166b-config\") pod \"dnsmasq-dns-5449989c59-lmpj4\" (UID: \"093f99c7-9f50-497d-b89c-33887663166b\") " pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.611530 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/093f99c7-9f50-497d-b89c-33887663166b-config\") pod \"dnsmasq-dns-5449989c59-lmpj4\" (UID: \"093f99c7-9f50-497d-b89c-33887663166b\") " pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.611668 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/093f99c7-9f50-497d-b89c-33887663166b-dns-svc\") pod \"dnsmasq-dns-5449989c59-lmpj4\" (UID: \"093f99c7-9f50-497d-b89c-33887663166b\") " pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.611702 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n78sp\" (UniqueName: \"kubernetes.io/projected/093f99c7-9f50-497d-b89c-33887663166b-kube-api-access-n78sp\") pod \"dnsmasq-dns-5449989c59-lmpj4\" (UID: \"093f99c7-9f50-497d-b89c-33887663166b\") " pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.612592 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/093f99c7-9f50-497d-b89c-33887663166b-config\") pod \"dnsmasq-dns-5449989c59-lmpj4\" (UID: \"093f99c7-9f50-497d-b89c-33887663166b\") " pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.612981 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/093f99c7-9f50-497d-b89c-33887663166b-dns-svc\") pod \"dnsmasq-dns-5449989c59-lmpj4\" (UID: 
\"093f99c7-9f50-497d-b89c-33887663166b\") " pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.629167 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n78sp\" (UniqueName: \"kubernetes.io/projected/093f99c7-9f50-497d-b89c-33887663166b-kube-api-access-n78sp\") pod \"dnsmasq-dns-5449989c59-lmpj4\" (UID: \"093f99c7-9f50-497d-b89c-33887663166b\") " pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.765492 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.770439 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-bv9sv"] Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.940079 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-vpm8h"] Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.961838 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.964286 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.966549 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8hjkc" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.966651 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.966904 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.966944 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.967047 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.968956 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.969157 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 05:43:19 crc kubenswrapper[4956]: I0930 05:43:19.975578 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.012591 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" event={"ID":"848f53ca-813b-4f49-ba68-8d7a22732438","Type":"ContainerStarted","Data":"b6d873318d8a81b4a23e8e5426b0896c875ef93f395aa5b2333b41c80ec18172"} Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.043240 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" 
event={"ID":"505e587f-c235-464f-9bea-d5ae3f8fb219","Type":"ContainerStarted","Data":"380ea132b18fe1a00d58feef573a39a47c70675aa1e7510bcd24bec285d38a45"} Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.143985 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.144036 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.144161 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j2s4\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-kube-api-access-6j2s4\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.144189 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.144260 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.144295 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.144343 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.144380 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ae54b47-b5ac-43a0-9752-797d2f81ff29-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.144411 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-config-data\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.144429 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " 
pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.144466 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ae54b47-b5ac-43a0-9752-797d2f81ff29-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.227636 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.229450 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.235358 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.235380 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rxl6n" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.235481 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.235588 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.235358 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.235740 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.236047 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 05:43:20 crc 
kubenswrapper[4956]: I0930 05:43:20.241062 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.245996 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j2s4\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-kube-api-access-6j2s4\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.246050 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.246105 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.246147 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.246189 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" 
Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.246217 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ae54b47-b5ac-43a0-9752-797d2f81ff29-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.246278 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-config-data\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.246305 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ae54b47-b5ac-43a0-9752-797d2f81ff29-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.246333 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.246389 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.246414 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.248584 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.250083 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.250662 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.257193 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.258169 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-config-data\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " 
pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.257211 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.261584 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.262286 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ae54b47-b5ac-43a0-9752-797d2f81ff29-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.266148 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.268744 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ae54b47-b5ac-43a0-9752-797d2f81ff29-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.291988 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j2s4\" 
(UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-kube-api-access-6j2s4\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.306068 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.347394 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.347455 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.347474 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.347504 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-confd\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.347573 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.347591 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.347617 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.347632 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.347654 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dtpc\" (UniqueName: 
\"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-kube-api-access-4dtpc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.347695 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.348352 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.416618 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-lmpj4"] Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.449191 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.449233 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.449257 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4dtpc\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-kube-api-access-4dtpc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.449294 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.449317 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.449361 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.449394 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.449409 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.449438 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.449464 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.449488 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.449859 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.450258 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.450494 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.452435 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.452660 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.452790 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.458267 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 
05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.461365 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.467183 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.476258 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.476390 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dtpc\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-kube-api-access-4dtpc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.479043 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.523170 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-notifications-server-0"] Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.525175 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.534480 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.534724 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.534495 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.534812 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.534815 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.535081 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-h689c" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.535520 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.562009 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.596175 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.654956 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de3a8c94-71b5-4948-9079-cc7009b9a8ea-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.655275 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de3a8c94-71b5-4948-9079-cc7009b9a8ea-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.655297 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de3a8c94-71b5-4948-9079-cc7009b9a8ea-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.655321 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de3a8c94-71b5-4948-9079-cc7009b9a8ea-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.655345 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dpwf\" (UniqueName: 
\"kubernetes.io/projected/de3a8c94-71b5-4948-9079-cc7009b9a8ea-kube-api-access-9dpwf\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.655369 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.655384 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de3a8c94-71b5-4948-9079-cc7009b9a8ea-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.655418 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de3a8c94-71b5-4948-9079-cc7009b9a8ea-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.655433 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de3a8c94-71b5-4948-9079-cc7009b9a8ea-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.655451 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de3a8c94-71b5-4948-9079-cc7009b9a8ea-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.655470 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de3a8c94-71b5-4948-9079-cc7009b9a8ea-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.656997 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.757053 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de3a8c94-71b5-4948-9079-cc7009b9a8ea-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.757147 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de3a8c94-71b5-4948-9079-cc7009b9a8ea-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.757181 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de3a8c94-71b5-4948-9079-cc7009b9a8ea-config-data\") pod 
\"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.757223 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dpwf\" (UniqueName: \"kubernetes.io/projected/de3a8c94-71b5-4948-9079-cc7009b9a8ea-kube-api-access-9dpwf\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.757302 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.757328 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de3a8c94-71b5-4948-9079-cc7009b9a8ea-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.757558 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.757931 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de3a8c94-71b5-4948-9079-cc7009b9a8ea-rabbitmq-confd\") pod 
\"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.757982 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de3a8c94-71b5-4948-9079-cc7009b9a8ea-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.758006 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de3a8c94-71b5-4948-9079-cc7009b9a8ea-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.758034 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de3a8c94-71b5-4948-9079-cc7009b9a8ea-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.758106 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de3a8c94-71b5-4948-9079-cc7009b9a8ea-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.758920 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de3a8c94-71b5-4948-9079-cc7009b9a8ea-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: 
\"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.758935 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de3a8c94-71b5-4948-9079-cc7009b9a8ea-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.759277 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de3a8c94-71b5-4948-9079-cc7009b9a8ea-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.759450 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de3a8c94-71b5-4948-9079-cc7009b9a8ea-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.759837 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de3a8c94-71b5-4948-9079-cc7009b9a8ea-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.763957 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de3a8c94-71b5-4948-9079-cc7009b9a8ea-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " 
pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.764713 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de3a8c94-71b5-4948-9079-cc7009b9a8ea-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.764927 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de3a8c94-71b5-4948-9079-cc7009b9a8ea-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.771977 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de3a8c94-71b5-4948-9079-cc7009b9a8ea-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.779734 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dpwf\" (UniqueName: \"kubernetes.io/projected/de3a8c94-71b5-4948-9079-cc7009b9a8ea-kube-api-access-9dpwf\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc kubenswrapper[4956]: I0930 05:43:20.801341 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"de3a8c94-71b5-4948-9079-cc7009b9a8ea\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:20 crc 
kubenswrapper[4956]: I0930 05:43:20.861587 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:43:21 crc kubenswrapper[4956]: I0930 05:43:21.052581 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5449989c59-lmpj4" event={"ID":"093f99c7-9f50-497d-b89c-33887663166b","Type":"ContainerStarted","Data":"1b415c165fc4546d286cb388fdb4e92b9842a8072113f9d8edf48c1622e8afb1"} Sep 30 05:43:21 crc kubenswrapper[4956]: I0930 05:43:21.090037 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 05:43:21 crc kubenswrapper[4956]: I0930 05:43:21.159947 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 05:43:21 crc kubenswrapper[4956]: W0930 05:43:21.168244 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fd85fdd_3cb1_4a7e_9aea_8823050ae1f4.slice/crio-7fef3a34e10a3a8f9c7f222c85bddcca9409b06a47aa55d2c9065e71a5c8709a WatchSource:0}: Error finding container 7fef3a34e10a3a8f9c7f222c85bddcca9409b06a47aa55d2c9065e71a5c8709a: Status 404 returned error can't find the container with id 7fef3a34e10a3a8f9c7f222c85bddcca9409b06a47aa55d2c9065e71a5c8709a Sep 30 05:43:21 crc kubenswrapper[4956]: I0930 05:43:21.396335 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Sep 30 05:43:22 crc kubenswrapper[4956]: I0930 05:43:22.060669 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4","Type":"ContainerStarted","Data":"7fef3a34e10a3a8f9c7f222c85bddcca9409b06a47aa55d2c9065e71a5c8709a"} Sep 30 05:43:22 crc kubenswrapper[4956]: I0930 05:43:22.061990 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" 
event={"ID":"de3a8c94-71b5-4948-9079-cc7009b9a8ea","Type":"ContainerStarted","Data":"71d76da1ec998cfa88fe8c92b88104a0717e2a5040511e48a3b7656affae89ab"} Sep 30 05:43:22 crc kubenswrapper[4956]: I0930 05:43:22.063271 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ae54b47-b5ac-43a0-9752-797d2f81ff29","Type":"ContainerStarted","Data":"d0d5a22f3362a82099890e3ce6be4ade017eca0e60431422b5b4bb86f161a72a"} Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.703471 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.705327 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.709893 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.716729 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.717031 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.717191 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.717891 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bgp5p" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.720424 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.724422 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.821110 
4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.822754 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.824972 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.825956 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-x7mvb" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.826313 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.833524 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.831062 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.842053 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/541322a8-d098-4331-ab5b-500262d4655c-config-data-default\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.841737 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.844248 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlnzj\" (UniqueName: \"kubernetes.io/projected/541322a8-d098-4331-ab5b-500262d4655c-kube-api-access-nlnzj\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.844317 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541322a8-d098-4331-ab5b-500262d4655c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.844336 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541322a8-d098-4331-ab5b-500262d4655c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.844364 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/541322a8-d098-4331-ab5b-500262d4655c-kolla-config\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.844416 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/541322a8-d098-4331-ab5b-500262d4655c-secrets\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.844445 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/541322a8-d098-4331-ab5b-500262d4655c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.844465 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/541322a8-d098-4331-ab5b-500262d4655c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946486 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946531 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/541322a8-d098-4331-ab5b-500262d4655c-config-data-default\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946558 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e95031-1e0d-4979-9926-ba52d0208646-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946596 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlnzj\" (UniqueName: 
\"kubernetes.io/projected/541322a8-d098-4331-ab5b-500262d4655c-kube-api-access-nlnzj\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946624 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946639 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541322a8-d098-4331-ab5b-500262d4655c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946652 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541322a8-d098-4331-ab5b-500262d4655c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946669 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e95031-1e0d-4979-9926-ba52d0208646-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946686 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/541322a8-d098-4331-ab5b-500262d4655c-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946706 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rv9z\" (UniqueName: \"kubernetes.io/projected/e0e95031-1e0d-4979-9926-ba52d0208646-kube-api-access-2rv9z\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946740 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0e95031-1e0d-4979-9926-ba52d0208646-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946759 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/541322a8-d098-4331-ab5b-500262d4655c-secrets\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946779 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0e95031-1e0d-4979-9926-ba52d0208646-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946799 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/541322a8-d098-4331-ab5b-500262d4655c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946819 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/541322a8-d098-4331-ab5b-500262d4655c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946854 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0e95031-1e0d-4979-9926-ba52d0208646-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946872 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0e95031-1e0d-4979-9926-ba52d0208646-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.946893 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e0e95031-1e0d-4979-9926-ba52d0208646-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.947253 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") device mount path 
\"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.947685 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/541322a8-d098-4331-ab5b-500262d4655c-kolla-config\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.948503 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/541322a8-d098-4331-ab5b-500262d4655c-config-data-default\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.949951 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541322a8-d098-4331-ab5b-500262d4655c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.951678 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/541322a8-d098-4331-ab5b-500262d4655c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.958600 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/541322a8-d098-4331-ab5b-500262d4655c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.964224 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541322a8-d098-4331-ab5b-500262d4655c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.968464 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlnzj\" (UniqueName: \"kubernetes.io/projected/541322a8-d098-4331-ab5b-500262d4655c-kube-api-access-nlnzj\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.973660 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/541322a8-d098-4331-ab5b-500262d4655c-secrets\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:23 crc kubenswrapper[4956]: I0930 05:43:23.982951 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"541322a8-d098-4331-ab5b-500262d4655c\") " pod="openstack/openstack-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.042597 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.049098 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0e95031-1e0d-4979-9926-ba52d0208646-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.049160 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0e95031-1e0d-4979-9926-ba52d0208646-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.049204 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0e95031-1e0d-4979-9926-ba52d0208646-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.049222 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0e95031-1e0d-4979-9926-ba52d0208646-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.049243 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e0e95031-1e0d-4979-9926-ba52d0208646-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc 
kubenswrapper[4956]: I0930 05:43:24.049281 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e95031-1e0d-4979-9926-ba52d0208646-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.049325 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.049345 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e95031-1e0d-4979-9926-ba52d0208646-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.049364 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rv9z\" (UniqueName: \"kubernetes.io/projected/e0e95031-1e0d-4979-9926-ba52d0208646-kube-api-access-2rv9z\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.049995 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0e95031-1e0d-4979-9926-ba52d0208646-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.050490 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0e95031-1e0d-4979-9926-ba52d0208646-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.050796 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0e95031-1e0d-4979-9926-ba52d0208646-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.050840 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.051687 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0e95031-1e0d-4979-9926-ba52d0208646-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.062097 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e95031-1e0d-4979-9926-ba52d0208646-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.062529 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/e0e95031-1e0d-4979-9926-ba52d0208646-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.082822 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e95031-1e0d-4979-9926-ba52d0208646-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.093249 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rv9z\" (UniqueName: \"kubernetes.io/projected/e0e95031-1e0d-4979-9926-ba52d0208646-kube-api-access-2rv9z\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.103822 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e0e95031-1e0d-4979-9926-ba52d0208646\") " pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.137674 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.139006 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.142621 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-6szpk" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.142976 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.143147 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.153046 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.158382 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.254467 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5126f1e-65cb-4032-9df3-8cd061c43253-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.254571 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b5126f1e-65cb-4032-9df3-8cd061c43253-kolla-config\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.254593 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgfgw\" (UniqueName: \"kubernetes.io/projected/b5126f1e-65cb-4032-9df3-8cd061c43253-kube-api-access-rgfgw\") pod \"memcached-0\" (UID: 
\"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.254615 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5126f1e-65cb-4032-9df3-8cd061c43253-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.254663 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5126f1e-65cb-4032-9df3-8cd061c43253-config-data\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.364804 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5126f1e-65cb-4032-9df3-8cd061c43253-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.367423 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b5126f1e-65cb-4032-9df3-8cd061c43253-kolla-config\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.367462 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgfgw\" (UniqueName: \"kubernetes.io/projected/b5126f1e-65cb-4032-9df3-8cd061c43253-kube-api-access-rgfgw\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.367536 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5126f1e-65cb-4032-9df3-8cd061c43253-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.367734 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5126f1e-65cb-4032-9df3-8cd061c43253-config-data\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.370368 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5126f1e-65cb-4032-9df3-8cd061c43253-config-data\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.375696 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5126f1e-65cb-4032-9df3-8cd061c43253-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.378385 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5126f1e-65cb-4032-9df3-8cd061c43253-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.385175 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b5126f1e-65cb-4032-9df3-8cd061c43253-kolla-config\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " 
pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.404232 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgfgw\" (UniqueName: \"kubernetes.io/projected/b5126f1e-65cb-4032-9df3-8cd061c43253-kube-api-access-rgfgw\") pod \"memcached-0\" (UID: \"b5126f1e-65cb-4032-9df3-8cd061c43253\") " pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.487995 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.801541 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 05:43:24 crc kubenswrapper[4956]: I0930 05:43:24.965751 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 05:43:24 crc kubenswrapper[4956]: W0930 05:43:24.973522 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0e95031_1e0d_4979_9926_ba52d0208646.slice/crio-12e58303ea8c0c62e85e55b867570db46b7935d2f823937ea77450eb93039b05 WatchSource:0}: Error finding container 12e58303ea8c0c62e85e55b867570db46b7935d2f823937ea77450eb93039b05: Status 404 returned error can't find the container with id 12e58303ea8c0c62e85e55b867570db46b7935d2f823937ea77450eb93039b05 Sep 30 05:43:25 crc kubenswrapper[4956]: I0930 05:43:25.092189 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 05:43:25 crc kubenswrapper[4956]: W0930 05:43:25.096922 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5126f1e_65cb_4032_9df3_8cd061c43253.slice/crio-3e32bc4efb3c8d801ed34f3f6ed8070e03e3015409499f480bbad66af66e8c98 WatchSource:0}: Error finding container 3e32bc4efb3c8d801ed34f3f6ed8070e03e3015409499f480bbad66af66e8c98: Status 404 returned 
error can't find the container with id 3e32bc4efb3c8d801ed34f3f6ed8070e03e3015409499f480bbad66af66e8c98 Sep 30 05:43:25 crc kubenswrapper[4956]: I0930 05:43:25.142172 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0e95031-1e0d-4979-9926-ba52d0208646","Type":"ContainerStarted","Data":"12e58303ea8c0c62e85e55b867570db46b7935d2f823937ea77450eb93039b05"} Sep 30 05:43:25 crc kubenswrapper[4956]: I0930 05:43:25.146825 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"541322a8-d098-4331-ab5b-500262d4655c","Type":"ContainerStarted","Data":"9ca9ea9c27e40f046b11373e90ede2777fec24c451077807911d3e4eb376ebf0"} Sep 30 05:43:25 crc kubenswrapper[4956]: I0930 05:43:25.149337 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b5126f1e-65cb-4032-9df3-8cd061c43253","Type":"ContainerStarted","Data":"3e32bc4efb3c8d801ed34f3f6ed8070e03e3015409499f480bbad66af66e8c98"} Sep 30 05:43:25 crc kubenswrapper[4956]: I0930 05:43:25.926249 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 05:43:25 crc kubenswrapper[4956]: I0930 05:43:25.927256 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 05:43:25 crc kubenswrapper[4956]: I0930 05:43:25.937998 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2rzfg" Sep 30 05:43:25 crc kubenswrapper[4956]: I0930 05:43:25.951634 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 05:43:26 crc kubenswrapper[4956]: I0930 05:43:26.012936 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvsss\" (UniqueName: \"kubernetes.io/projected/5994e7b3-410f-47d8-9aa1-ddd019b123ec-kube-api-access-lvsss\") pod \"kube-state-metrics-0\" (UID: \"5994e7b3-410f-47d8-9aa1-ddd019b123ec\") " pod="openstack/kube-state-metrics-0" Sep 30 05:43:26 crc kubenswrapper[4956]: I0930 05:43:26.116067 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvsss\" (UniqueName: \"kubernetes.io/projected/5994e7b3-410f-47d8-9aa1-ddd019b123ec-kube-api-access-lvsss\") pod \"kube-state-metrics-0\" (UID: \"5994e7b3-410f-47d8-9aa1-ddd019b123ec\") " pod="openstack/kube-state-metrics-0" Sep 30 05:43:26 crc kubenswrapper[4956]: I0930 05:43:26.157594 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvsss\" (UniqueName: \"kubernetes.io/projected/5994e7b3-410f-47d8-9aa1-ddd019b123ec-kube-api-access-lvsss\") pod \"kube-state-metrics-0\" (UID: \"5994e7b3-410f-47d8-9aa1-ddd019b123ec\") " pod="openstack/kube-state-metrics-0" Sep 30 05:43:26 crc kubenswrapper[4956]: I0930 05:43:26.303963 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.016793 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 05:43:27 crc kubenswrapper[4956]: W0930 05:43:27.035050 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5994e7b3_410f_47d8_9aa1_ddd019b123ec.slice/crio-bb6e09d6964112276353a46a1fb2704c2644aa5f5ef18831de79ec6cfcbacdc6 WatchSource:0}: Error finding container bb6e09d6964112276353a46a1fb2704c2644aa5f5ef18831de79ec6cfcbacdc6: Status 404 returned error can't find the container with id bb6e09d6964112276353a46a1fb2704c2644aa5f5ef18831de79ec6cfcbacdc6 Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.256649 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5994e7b3-410f-47d8-9aa1-ddd019b123ec","Type":"ContainerStarted","Data":"bb6e09d6964112276353a46a1fb2704c2644aa5f5ef18831de79ec6cfcbacdc6"} Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.257681 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.259759 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.263867 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.263910 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.264042 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-wmk85" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.266017 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.266237 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.271491 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.303428 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.340364 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.340407 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.340430 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f68c4eb3-451f-460b-996f-3a36ac7da7e2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.340474 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f68c4eb3-451f-460b-996f-3a36ac7da7e2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.340493 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.340509 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btpq5\" (UniqueName: \"kubernetes.io/projected/f68c4eb3-451f-460b-996f-3a36ac7da7e2-kube-api-access-btpq5\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.340538 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f68c4eb3-451f-460b-996f-3a36ac7da7e2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.340564 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-config\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.443020 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f68c4eb3-451f-460b-996f-3a36ac7da7e2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.443080 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-config\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.443242 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.443269 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.443285 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f68c4eb3-451f-460b-996f-3a36ac7da7e2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.443341 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f68c4eb3-451f-460b-996f-3a36ac7da7e2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.443369 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.443384 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btpq5\" (UniqueName: \"kubernetes.io/projected/f68c4eb3-451f-460b-996f-3a36ac7da7e2-kube-api-access-btpq5\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.446583 4956 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f68c4eb3-451f-460b-996f-3a36ac7da7e2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.448970 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-config\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.449322 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f68c4eb3-451f-460b-996f-3a36ac7da7e2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.449741 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f68c4eb3-451f-460b-996f-3a36ac7da7e2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.449969 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.451378 4956 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.451402 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dd69768781dcae5046075cd89d0649334b1721ac4e99f4c4ea45de29b58bc7b6/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.456800 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.458849 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btpq5\" (UniqueName: \"kubernetes.io/projected/f68c4eb3-451f-460b-996f-3a36ac7da7e2-kube-api-access-btpq5\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.484533 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"prometheus-metric-storage-0\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:27 crc kubenswrapper[4956]: I0930 05:43:27.592046 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 05:43:28 crc kubenswrapper[4956]: I0930 05:43:28.144377 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.302620 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r7wfs"] Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.303593 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.307417 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.307445 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.307415 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-g77p2" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.310579 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r7wfs"] Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.311176 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f68c4eb3-451f-460b-996f-3a36ac7da7e2","Type":"ContainerStarted","Data":"15fb3583c205c5af8fc5383f2d4602dbaef094800684712599f1b72cc950bb55"} Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.376898 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-p9lvd"] Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.380170 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.390065 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-p9lvd"] Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.421927 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/611676cd-11d2-44c4-bae2-41b6b22f898d-ovn-controller-tls-certs\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.421996 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611676cd-11d2-44c4-bae2-41b6b22f898d-combined-ca-bundle\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.422028 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/611676cd-11d2-44c4-bae2-41b6b22f898d-scripts\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.422063 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fndx4\" (UniqueName: \"kubernetes.io/projected/611676cd-11d2-44c4-bae2-41b6b22f898d-kube-api-access-fndx4\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.422084 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/611676cd-11d2-44c4-bae2-41b6b22f898d-var-run-ovn\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.422106 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/611676cd-11d2-44c4-bae2-41b6b22f898d-var-log-ovn\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.422204 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/611676cd-11d2-44c4-bae2-41b6b22f898d-var-run\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.524910 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/611676cd-11d2-44c4-bae2-41b6b22f898d-var-run\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.525086 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/611676cd-11d2-44c4-bae2-41b6b22f898d-ovn-controller-tls-certs\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.525264 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-var-lib\") pod 
\"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.525414 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611676cd-11d2-44c4-bae2-41b6b22f898d-combined-ca-bundle\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.525586 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k8d8\" (UniqueName: \"kubernetes.io/projected/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-kube-api-access-2k8d8\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.525678 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/611676cd-11d2-44c4-bae2-41b6b22f898d-scripts\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.525751 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-scripts\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.525829 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-etc-ovs\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") 
" pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.525837 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/611676cd-11d2-44c4-bae2-41b6b22f898d-var-run\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.525942 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fndx4\" (UniqueName: \"kubernetes.io/projected/611676cd-11d2-44c4-bae2-41b6b22f898d-kube-api-access-fndx4\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.526002 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/611676cd-11d2-44c4-bae2-41b6b22f898d-var-run-ovn\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.526047 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-var-run\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.526102 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/611676cd-11d2-44c4-bae2-41b6b22f898d-var-log-ovn\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.526225 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-var-log\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.529343 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/611676cd-11d2-44c4-bae2-41b6b22f898d-var-run-ovn\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.529522 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/611676cd-11d2-44c4-bae2-41b6b22f898d-var-log-ovn\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.535818 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611676cd-11d2-44c4-bae2-41b6b22f898d-combined-ca-bundle\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.542009 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/611676cd-11d2-44c4-bae2-41b6b22f898d-scripts\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.546651 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fndx4\" (UniqueName: \"kubernetes.io/projected/611676cd-11d2-44c4-bae2-41b6b22f898d-kube-api-access-fndx4\") pod 
\"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.555183 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/611676cd-11d2-44c4-bae2-41b6b22f898d-ovn-controller-tls-certs\") pod \"ovn-controller-r7wfs\" (UID: \"611676cd-11d2-44c4-bae2-41b6b22f898d\") " pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.627581 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-var-lib\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.627642 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k8d8\" (UniqueName: \"kubernetes.io/projected/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-kube-api-access-2k8d8\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.627670 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-scripts\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.627694 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-etc-ovs\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 
05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.627724 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-var-run\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.627746 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-var-log\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.628110 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-var-log\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.628271 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-var-lib\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.630292 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-scripts\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.630451 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-etc-ovs\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.630508 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-var-run\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.637224 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r7wfs" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.660059 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k8d8\" (UniqueName: \"kubernetes.io/projected/f61a8fc3-4802-4dfb-b17e-fd4b8db1b863-kube-api-access-2k8d8\") pod \"ovn-controller-ovs-p9lvd\" (UID: \"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863\") " pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:29 crc kubenswrapper[4956]: I0930 05:43:29.702508 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:43:30 crc kubenswrapper[4956]: I0930 05:43:30.912881 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 05:43:30 crc kubenswrapper[4956]: I0930 05:43:30.914498 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:30 crc kubenswrapper[4956]: I0930 05:43:30.916344 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-hfjs8" Sep 30 05:43:30 crc kubenswrapper[4956]: I0930 05:43:30.917861 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 05:43:30 crc kubenswrapper[4956]: I0930 05:43:30.917936 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 30 05:43:30 crc kubenswrapper[4956]: I0930 05:43:30.918079 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 30 05:43:30 crc kubenswrapper[4956]: I0930 05:43:30.920017 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 05:43:30 crc kubenswrapper[4956]: I0930 05:43:30.931609 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.060357 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82458356-9089-4c9d-a672-746eb618af3d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.060437 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5xwx\" (UniqueName: \"kubernetes.io/projected/82458356-9089-4c9d-a672-746eb618af3d-kube-api-access-n5xwx\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.062379 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/82458356-9089-4c9d-a672-746eb618af3d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.062566 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82458356-9089-4c9d-a672-746eb618af3d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.062618 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82458356-9089-4c9d-a672-746eb618af3d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.062707 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82458356-9089-4c9d-a672-746eb618af3d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.062993 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.063159 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/82458356-9089-4c9d-a672-746eb618af3d-config\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.165232 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82458356-9089-4c9d-a672-746eb618af3d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.165369 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.165409 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82458356-9089-4c9d-a672-746eb618af3d-config\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.165455 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82458356-9089-4c9d-a672-746eb618af3d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.165504 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5xwx\" (UniqueName: \"kubernetes.io/projected/82458356-9089-4c9d-a672-746eb618af3d-kube-api-access-n5xwx\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 
crc kubenswrapper[4956]: I0930 05:43:31.165537 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/82458356-9089-4c9d-a672-746eb618af3d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.165574 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82458356-9089-4c9d-a672-746eb618af3d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.165593 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82458356-9089-4c9d-a672-746eb618af3d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.165850 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.166559 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82458356-9089-4c9d-a672-746eb618af3d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.167132 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/82458356-9089-4c9d-a672-746eb618af3d-config\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.171030 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/82458356-9089-4c9d-a672-746eb618af3d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.173717 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82458356-9089-4c9d-a672-746eb618af3d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.178230 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82458356-9089-4c9d-a672-746eb618af3d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.201011 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5xwx\" (UniqueName: \"kubernetes.io/projected/82458356-9089-4c9d-a672-746eb618af3d-kube-api-access-n5xwx\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.205635 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82458356-9089-4c9d-a672-746eb618af3d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " 
pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.227305 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"82458356-9089-4c9d-a672-746eb618af3d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:31 crc kubenswrapper[4956]: I0930 05:43:31.253776 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.160330 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.167839 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.173648 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.174015 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-jv7t7" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.174240 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.175949 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.194278 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.307886 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.307929 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.308060 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss5gq\" (UniqueName: \"kubernetes.io/projected/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-kube-api-access-ss5gq\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.308090 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.308134 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.308158 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.308175 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.308208 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-config\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.410016 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss5gq\" (UniqueName: \"kubernetes.io/projected/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-kube-api-access-ss5gq\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.410071 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.410095 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 
05:43:33.410113 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.410143 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.410178 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-config\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.410206 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.410223 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.412613 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.412770 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-config\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.412787 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.413080 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.417263 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.425389 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 
05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.432730 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss5gq\" (UniqueName: \"kubernetes.io/projected/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-kube-api-access-ss5gq\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.433026 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd6f01ba-ec4b-4b98-8a33-e029d86b258b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.451321 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bd6f01ba-ec4b-4b98-8a33-e029d86b258b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.594554 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 05:43:33 crc kubenswrapper[4956]: I0930 05:43:33.779724 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 05:43:41 crc kubenswrapper[4956]: W0930 05:43:41.211532 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82458356_9089_4c9d_a672_746eb618af3d.slice/crio-af2af21c35ec3512e6eef7c7bcd1ec24ea2ceaa11a42dcd35eb8dee638444295 WatchSource:0}: Error finding container af2af21c35ec3512e6eef7c7bcd1ec24ea2ceaa11a42dcd35eb8dee638444295: Status 404 returned error can't find the container with id af2af21c35ec3512e6eef7c7bcd1ec24ea2ceaa11a42dcd35eb8dee638444295 Sep 30 05:43:41 crc kubenswrapper[4956]: I0930 05:43:41.425949 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"82458356-9089-4c9d-a672-746eb618af3d","Type":"ContainerStarted","Data":"af2af21c35ec3512e6eef7c7bcd1ec24ea2ceaa11a42dcd35eb8dee638444295"} Sep 30 05:43:41 crc kubenswrapper[4956]: I0930 05:43:41.687126 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r7wfs"] Sep 30 05:43:41 crc kubenswrapper[4956]: I0930 05:43:41.706060 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-p9lvd"] Sep 30 05:43:42 crc kubenswrapper[4956]: W0930 05:43:42.383552 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod611676cd_11d2_44c4_bae2_41b6b22f898d.slice/crio-7ee329c77c7bc12672c95765441ce2a482413dcb043b129ff09b4f82a91105a7 WatchSource:0}: Error finding container 7ee329c77c7bc12672c95765441ce2a482413dcb043b129ff09b4f82a91105a7: Status 404 returned error can't find the container with id 7ee329c77c7bc12672c95765441ce2a482413dcb043b129ff09b4f82a91105a7 Sep 30 05:43:42 crc kubenswrapper[4956]: W0930 05:43:42.387386 4956 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf61a8fc3_4802_4dfb_b17e_fd4b8db1b863.slice/crio-19c3f2a09b554a93e9eefe778a43597d9243c5e01ea7c6204f596d1c637e7577 WatchSource:0}: Error finding container 19c3f2a09b554a93e9eefe778a43597d9243c5e01ea7c6204f596d1c637e7577: Status 404 returned error can't find the container with id 19c3f2a09b554a93e9eefe778a43597d9243c5e01ea7c6204f596d1c637e7577 Sep 30 05:43:42 crc kubenswrapper[4956]: E0930 05:43:42.388470 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb" Sep 30 05:43:42 crc kubenswrapper[4956]: E0930 05:43:42.388496 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb" Sep 30 05:43:42 crc kubenswrapper[4956]: E0930 05:43:42.388583 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lvsss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(5994e7b3-410f-47d8-9aa1-ddd019b123ec): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Sep 30 05:43:42 crc kubenswrapper[4956]: E0930 05:43:42.389665 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="5994e7b3-410f-47d8-9aa1-ddd019b123ec" Sep 30 05:43:42 crc kubenswrapper[4956]: I0930 05:43:42.433398 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r7wfs" event={"ID":"611676cd-11d2-44c4-bae2-41b6b22f898d","Type":"ContainerStarted","Data":"7ee329c77c7bc12672c95765441ce2a482413dcb043b129ff09b4f82a91105a7"} Sep 30 05:43:42 crc kubenswrapper[4956]: I0930 05:43:42.435055 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9lvd" event={"ID":"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863","Type":"ContainerStarted","Data":"19c3f2a09b554a93e9eefe778a43597d9243c5e01ea7c6204f596d1c637e7577"} Sep 30 05:43:42 crc kubenswrapper[4956]: E0930 05:43:42.439089 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb\\\"\"" pod="openstack/kube-state-metrics-0" podUID="5994e7b3-410f-47d8-9aa1-ddd019b123ec" Sep 30 05:43:48 crc kubenswrapper[4956]: I0930 05:43:48.073370 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:43:48 crc kubenswrapper[4956]: I0930 05:43:48.073738 4956 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.618772 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876: Get \"https://quay.rdoproject.org/v2/podified-master-centos10/openstack-ovn-controller/blobs/sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876\": context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.619270 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = reading blob sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876: Get \"https://quay.rdoproject.org/v2/podified-master-centos10/openstack-ovn-controller/blobs/sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876\": context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.619405 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n588h568h697h694h65bh58ch55dh54dh87h557h58ch6dh557h5cfh588hd5h8chc5h568h587h5c9h67h9ch5dch74h5cchd4hcbh9fh5d8h556h697q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fndx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC
:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-r7wfs_openstack(611676cd-11d2-44c4-bae2-41b6b22f898d): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876: Get \"https://quay.rdoproject.org/v2/podified-master-centos10/openstack-ovn-controller/blobs/sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876\": context canceled" logger="UnhandledError" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.620622 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876: Get 
\\\"https://quay.rdoproject.org/v2/podified-master-centos10/openstack-ovn-controller/blobs/sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876\\\": context canceled\"" pod="openstack/ovn-controller-r7wfs" podUID="611676cd-11d2-44c4-bae2-41b6b22f898d" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.657332 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876: Get \"https://quay.rdoproject.org/v2/podified-master-centos10/openstack-ovn-nb-db-server/blobs/sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876\": context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.657397 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = reading blob sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876: Get \"https://quay.rdoproject.org/v2/podified-master-centos10/openstack-ovn-nb-db-server/blobs/sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876\": context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.657555 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-nb,Image:quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n558hc8h585h8bh656h5bbh65ch5cdh84h554h644h5d4h5b4h5d4h58dh587h54hfhffh685h5c4h5dfh687h5c6h564h56fh64dh77hcdhf5h559hfbq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n5xwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecA
ction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(82458356-9089-4c9d-a672-746eb618af3d): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876: Get \"https://quay.rdoproject.org/v2/podified-master-centos10/openstack-ovn-nb-db-server/blobs/sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876\": context canceled" logger="UnhandledError" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.662789 
4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.662858 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.663058 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6j2s4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(0ae54b47-b5ac-43a0-9752-797d2f81ff29): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:43:55 crc 
kubenswrapper[4956]: E0930 05:43:55.664609 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="0ae54b47-b5ac-43a0-9752-797d2f81ff29" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.667958 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.668012 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Sep 30 05:43:55 crc kubenswrapper[4956]: E0930 05:43:55.668187 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9dpwf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-notifications-server-0_openstack(de3a8c94-71b5-4948-9079-cc7009b9a8ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:43:55 crc 
kubenswrapper[4956]: E0930 05:43:55.670986 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-notifications-server-0" podUID="de3a8c94-71b5-4948-9079-cc7009b9a8ea" Sep 30 05:43:56 crc kubenswrapper[4956]: I0930 05:43:56.098168 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 05:43:56 crc kubenswrapper[4956]: E0930 05:43:56.440077 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Sep 30 05:43:56 crc kubenswrapper[4956]: E0930 05:43:56.440148 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Sep 30 05:43:56 crc kubenswrapper[4956]: E0930 05:43:56.440255 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c7h56dh5cfh8bh54fhbbhf4h5b9hdch67fhd7h55fh55fh6ch9h548h54ch665h647h6h8fhd6h5dfh5cdh58bh577h66fh695h5fbh55h77h5fcq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n78sp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5449989c59-lmpj4_openstack(093f99c7-9f50-497d-b89c-33887663166b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:43:56 crc kubenswrapper[4956]: E0930 05:43:56.441411 4956 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5449989c59-lmpj4" podUID="093f99c7-9f50-497d-b89c-33887663166b" Sep 30 05:43:56 crc kubenswrapper[4956]: E0930 05:43:56.442431 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Sep 30 05:43:56 crc kubenswrapper[4956]: E0930 05:43:56.442486 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Sep 30 05:43:56 crc kubenswrapper[4956]: E0930 05:43:56.442600 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j9zsj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86b8f4ff9-vpm8h_openstack(505e587f-c235-464f-9bea-d5ae3f8fb219): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:43:56 crc kubenswrapper[4956]: E0930 05:43:56.444391 4956 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" podUID="505e587f-c235-464f-9bea-d5ae3f8fb219" Sep 30 05:43:56 crc kubenswrapper[4956]: E0930 05:43:56.541920 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current\\\"\"" pod="openstack/ovn-controller-r7wfs" podUID="611676cd-11d2-44c4-bae2-41b6b22f898d" Sep 30 05:43:56 crc kubenswrapper[4956]: E0930 05:43:56.543253 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current\\\"\"" pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" podUID="505e587f-c235-464f-9bea-d5ae3f8fb219" Sep 30 05:43:56 crc kubenswrapper[4956]: E0930 05:43:56.543964 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current\\\"\"" pod="openstack/dnsmasq-dns-5449989c59-lmpj4" podUID="093f99c7-9f50-497d-b89c-33887663166b" Sep 30 05:43:57 crc kubenswrapper[4956]: E0930 05:43:57.437622 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-base:current" Sep 30 05:43:57 crc kubenswrapper[4956]: E0930 05:43:57.437680 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-base:current" Sep 30 05:43:57 crc 
kubenswrapper[4956]: E0930 05:43:57.438882 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ovsdb-server-init,Image:quay.rdoproject.org/podified-master-centos10/openstack-ovn-base:current,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n588h568h697h694h65bh58ch55dh54dh87h557h58ch6dh557h5cfh588hd5h8chc5h568h587h5c9h67h9ch5dch74h5cchd4hcbh9fh5d8h556h697q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2k8d8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN 
SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-p9lvd_openstack(f61a8fc3-4802-4dfb-b17e-fd4b8db1b863): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:43:57 crc kubenswrapper[4956]: E0930 05:43:57.440096 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-p9lvd" podUID="f61a8fc3-4802-4dfb-b17e-fd4b8db1b863" Sep 30 05:43:57 crc kubenswrapper[4956]: E0930 05:43:57.547819 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-base:current\\\"\"" pod="openstack/ovn-controller-ovs-p9lvd" podUID="f61a8fc3-4802-4dfb-b17e-fd4b8db1b863" Sep 30 05:43:58 crc kubenswrapper[4956]: W0930 05:43:58.564528 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd6f01ba_ec4b_4b98_8a33_e029d86b258b.slice/crio-dc2404191301b5dba004a84416db6323dd11d5a2a90f61ba2ceaa9fe2e72d82c WatchSource:0}: Error finding container dc2404191301b5dba004a84416db6323dd11d5a2a90f61ba2ceaa9fe2e72d82c: Status 404 returned error can't find the container with id dc2404191301b5dba004a84416db6323dd11d5a2a90f61ba2ceaa9fe2e72d82c Sep 30 05:43:58 crc kubenswrapper[4956]: E0930 05:43:58.638683 4956 
log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Sep 30 05:43:58 crc kubenswrapper[4956]: E0930 05:43:58.638735 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Sep 30 05:43:58 crc kubenswrapper[4956]: E0930 05:43:58.638841 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-588t9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8468885bfc-7qh59_openstack(bdac8ebb-8d05-4766-9f09-0d6084d2168f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:43:58 crc kubenswrapper[4956]: E0930 05:43:58.640063 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-8468885bfc-7qh59" podUID="bdac8ebb-8d05-4766-9f09-0d6084d2168f" Sep 30 05:43:58 crc kubenswrapper[4956]: E0930 05:43:58.674317 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Sep 30 05:43:58 crc kubenswrapper[4956]: E0930 05:43:58.674373 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Sep 30 05:43:58 crc kubenswrapper[4956]: E0930 05:43:58.674483 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g525k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-545d49fd5c-sxs9t_openstack(0476d8b4-b1a2-4575-8e7e-363da630e6c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:43:58 crc kubenswrapper[4956]: E0930 05:43:58.675755 4956 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" podUID="0476d8b4-b1a2-4575-8e7e-363da630e6c3" Sep 30 05:43:58 crc kubenswrapper[4956]: E0930 05:43:58.694198 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Sep 30 05:43:58 crc kubenswrapper[4956]: E0930 05:43:58.694252 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Sep 30 05:43:58 crc kubenswrapper[4956]: E0930 05:43:58.694363 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6dq2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-b9b4959cc-bv9sv_openstack(848f53ca-813b-4f49-ba68-8d7a22732438): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:43:58 crc kubenswrapper[4956]: E0930 05:43:58.695560 4956 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" podUID="848f53ca-813b-4f49-ba68-8d7a22732438" Sep 30 05:43:59 crc kubenswrapper[4956]: I0930 05:43:59.562296 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4","Type":"ContainerStarted","Data":"1dd46384bd89f5a02cbb07228ce6eb6563ced7e59f1d12f3cc83c531865ff7de"} Sep 30 05:43:59 crc kubenswrapper[4956]: I0930 05:43:59.573348 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"541322a8-d098-4331-ab5b-500262d4655c","Type":"ContainerStarted","Data":"70dca1d04711684976e8bff4998509ea24c6ca1ffe54efc06373a52014c1cd9d"} Sep 30 05:43:59 crc kubenswrapper[4956]: I0930 05:43:59.574961 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b5126f1e-65cb-4032-9df3-8cd061c43253","Type":"ContainerStarted","Data":"75c277d6a4fa649ecfc7fc96e8e929783375071cbf90be4e790739a21202aa75"} Sep 30 05:43:59 crc kubenswrapper[4956]: I0930 05:43:59.575050 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 30 05:43:59 crc kubenswrapper[4956]: I0930 05:43:59.575811 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bd6f01ba-ec4b-4b98-8a33-e029d86b258b","Type":"ContainerStarted","Data":"dc2404191301b5dba004a84416db6323dd11d5a2a90f61ba2ceaa9fe2e72d82c"} Sep 30 05:43:59 crc kubenswrapper[4956]: I0930 05:43:59.630365 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.5323915399999999 podStartE2EDuration="35.630332823s" podCreationTimestamp="2025-09-30 05:43:24 +0000 UTC" firstStartedPulling="2025-09-30 05:43:25.1002038 +0000 UTC m=+875.427324325" 
lastFinishedPulling="2025-09-30 05:43:59.198145083 +0000 UTC m=+909.525265608" observedRunningTime="2025-09-30 05:43:59.62606748 +0000 UTC m=+909.953188015" watchObservedRunningTime="2025-09-30 05:43:59.630332823 +0000 UTC m=+909.957453348" Sep 30 05:44:00 crc kubenswrapper[4956]: I0930 05:44:00.585729 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ae54b47-b5ac-43a0-9752-797d2f81ff29","Type":"ContainerStarted","Data":"a061664973516acc797dccde5fe413c411f5ff1d462127643f441eef8a326d25"} Sep 30 05:44:00 crc kubenswrapper[4956]: I0930 05:44:00.586924 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0e95031-1e0d-4979-9926-ba52d0208646","Type":"ContainerStarted","Data":"4914039716bba2efb5fbdef5e5832debcfe6eee7fc54d8e5f1dd49e89f54d499"} Sep 30 05:44:00 crc kubenswrapper[4956]: I0930 05:44:00.588786 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"de3a8c94-71b5-4948-9079-cc7009b9a8ea","Type":"ContainerStarted","Data":"07f92de04e4e412abde86722297c918552f19606b53452424118780ea3a7b636"} Sep 30 05:44:00 crc kubenswrapper[4956]: I0930 05:44:00.591450 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5994e7b3-410f-47d8-9aa1-ddd019b123ec","Type":"ContainerStarted","Data":"0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700"} Sep 30 05:44:00 crc kubenswrapper[4956]: I0930 05:44:00.629005 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.487241581 podStartE2EDuration="35.628988584s" podCreationTimestamp="2025-09-30 05:43:25 +0000 UTC" firstStartedPulling="2025-09-30 05:43:27.056624216 +0000 UTC m=+877.383744741" lastFinishedPulling="2025-09-30 05:43:59.198371229 +0000 UTC m=+909.525491744" observedRunningTime="2025-09-30 05:44:00.620572661 +0000 UTC 
m=+910.947693176" watchObservedRunningTime="2025-09-30 05:44:00.628988584 +0000 UTC m=+910.956109109" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.220467 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-7qh59" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.227024 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.326237 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-588t9\" (UniqueName: \"kubernetes.io/projected/bdac8ebb-8d05-4766-9f09-0d6084d2168f-kube-api-access-588t9\") pod \"bdac8ebb-8d05-4766-9f09-0d6084d2168f\" (UID: \"bdac8ebb-8d05-4766-9f09-0d6084d2168f\") " Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.326302 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/848f53ca-813b-4f49-ba68-8d7a22732438-dns-svc\") pod \"848f53ca-813b-4f49-ba68-8d7a22732438\" (UID: \"848f53ca-813b-4f49-ba68-8d7a22732438\") " Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.326363 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdac8ebb-8d05-4766-9f09-0d6084d2168f-config\") pod \"bdac8ebb-8d05-4766-9f09-0d6084d2168f\" (UID: \"bdac8ebb-8d05-4766-9f09-0d6084d2168f\") " Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.326432 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/848f53ca-813b-4f49-ba68-8d7a22732438-config\") pod \"848f53ca-813b-4f49-ba68-8d7a22732438\" (UID: \"848f53ca-813b-4f49-ba68-8d7a22732438\") " Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.326510 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-6dq2b\" (UniqueName: \"kubernetes.io/projected/848f53ca-813b-4f49-ba68-8d7a22732438-kube-api-access-6dq2b\") pod \"848f53ca-813b-4f49-ba68-8d7a22732438\" (UID: \"848f53ca-813b-4f49-ba68-8d7a22732438\") " Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.327734 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/848f53ca-813b-4f49-ba68-8d7a22732438-config" (OuterVolumeSpecName: "config") pod "848f53ca-813b-4f49-ba68-8d7a22732438" (UID: "848f53ca-813b-4f49-ba68-8d7a22732438"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.327745 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/848f53ca-813b-4f49-ba68-8d7a22732438-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "848f53ca-813b-4f49-ba68-8d7a22732438" (UID: "848f53ca-813b-4f49-ba68-8d7a22732438"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.327741 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdac8ebb-8d05-4766-9f09-0d6084d2168f-config" (OuterVolumeSpecName: "config") pod "bdac8ebb-8d05-4766-9f09-0d6084d2168f" (UID: "bdac8ebb-8d05-4766-9f09-0d6084d2168f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.331675 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848f53ca-813b-4f49-ba68-8d7a22732438-kube-api-access-6dq2b" (OuterVolumeSpecName: "kube-api-access-6dq2b") pod "848f53ca-813b-4f49-ba68-8d7a22732438" (UID: "848f53ca-813b-4f49-ba68-8d7a22732438"). InnerVolumeSpecName "kube-api-access-6dq2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.331826 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdac8ebb-8d05-4766-9f09-0d6084d2168f-kube-api-access-588t9" (OuterVolumeSpecName: "kube-api-access-588t9") pod "bdac8ebb-8d05-4766-9f09-0d6084d2168f" (UID: "bdac8ebb-8d05-4766-9f09-0d6084d2168f"). InnerVolumeSpecName "kube-api-access-588t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.391885 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.428314 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdac8ebb-8d05-4766-9f09-0d6084d2168f-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.428342 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/848f53ca-813b-4f49-ba68-8d7a22732438-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.428351 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dq2b\" (UniqueName: \"kubernetes.io/projected/848f53ca-813b-4f49-ba68-8d7a22732438-kube-api-access-6dq2b\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.428359 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-588t9\" (UniqueName: \"kubernetes.io/projected/bdac8ebb-8d05-4766-9f09-0d6084d2168f-kube-api-access-588t9\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.428367 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/848f53ca-813b-4f49-ba68-8d7a22732438-dns-svc\") 
on node \"crc\" DevicePath \"\"" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.529803 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g525k\" (UniqueName: \"kubernetes.io/projected/0476d8b4-b1a2-4575-8e7e-363da630e6c3-kube-api-access-g525k\") pod \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\" (UID: \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\") " Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.529908 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0476d8b4-b1a2-4575-8e7e-363da630e6c3-config\") pod \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\" (UID: \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\") " Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.529983 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0476d8b4-b1a2-4575-8e7e-363da630e6c3-dns-svc\") pod \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\" (UID: \"0476d8b4-b1a2-4575-8e7e-363da630e6c3\") " Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.530558 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0476d8b4-b1a2-4575-8e7e-363da630e6c3-config" (OuterVolumeSpecName: "config") pod "0476d8b4-b1a2-4575-8e7e-363da630e6c3" (UID: "0476d8b4-b1a2-4575-8e7e-363da630e6c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.530593 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0476d8b4-b1a2-4575-8e7e-363da630e6c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0476d8b4-b1a2-4575-8e7e-363da630e6c3" (UID: "0476d8b4-b1a2-4575-8e7e-363da630e6c3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.534530 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0476d8b4-b1a2-4575-8e7e-363da630e6c3-kube-api-access-g525k" (OuterVolumeSpecName: "kube-api-access-g525k") pod "0476d8b4-b1a2-4575-8e7e-363da630e6c3" (UID: "0476d8b4-b1a2-4575-8e7e-363da630e6c3"). InnerVolumeSpecName "kube-api-access-g525k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.601539 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" event={"ID":"848f53ca-813b-4f49-ba68-8d7a22732438","Type":"ContainerDied","Data":"b6d873318d8a81b4a23e8e5426b0896c875ef93f395aa5b2333b41c80ec18172"} Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.601577 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9b4959cc-bv9sv" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.602630 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" event={"ID":"0476d8b4-b1a2-4575-8e7e-363da630e6c3","Type":"ContainerDied","Data":"d5ec7611b14ee7ce301e12c665093eed9d6c2197c5bdb9a061358cf5d5b0a42f"} Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.602698 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-sxs9t" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.606082 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f68c4eb3-451f-460b-996f-3a36ac7da7e2","Type":"ContainerStarted","Data":"d77eb0cfddb1f45cbacce87234ca631853001469f8ce60c4a37151e6a621f3bf"} Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.607758 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-7qh59" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.607787 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8468885bfc-7qh59" event={"ID":"bdac8ebb-8d05-4766-9f09-0d6084d2168f","Type":"ContainerDied","Data":"b8b5d65c8d744b2336bf8cfd50b7a6b8b038500c5d00d5b04084235f8a018e85"} Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.631819 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g525k\" (UniqueName: \"kubernetes.io/projected/0476d8b4-b1a2-4575-8e7e-363da630e6c3-kube-api-access-g525k\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.631849 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0476d8b4-b1a2-4575-8e7e-363da630e6c3-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.631861 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0476d8b4-b1a2-4575-8e7e-363da630e6c3-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.693378 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-sxs9t"] Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.700958 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-sxs9t"] Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.762361 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-bv9sv"] Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.774264 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b9b4959cc-bv9sv"] Sep 30 05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.789217 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-7qh59"] Sep 30 
05:44:01 crc kubenswrapper[4956]: I0930 05:44:01.793959 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-7qh59"] Sep 30 05:44:01 crc kubenswrapper[4956]: E0930 05:44:01.810921 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876: Get \\\"https://quay.rdoproject.org/v2/podified-master-centos10/openstack-ovn-nb-db-server/blobs/sha256:a8e3bb0d956443cd6a44b7b55b218816f2190c140bb5511a12f36b9f564b5876\\\": context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="82458356-9089-4c9d-a672-746eb618af3d" Sep 30 05:44:02 crc kubenswrapper[4956]: I0930 05:44:02.350590 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0476d8b4-b1a2-4575-8e7e-363da630e6c3" path="/var/lib/kubelet/pods/0476d8b4-b1a2-4575-8e7e-363da630e6c3/volumes" Sep 30 05:44:02 crc kubenswrapper[4956]: I0930 05:44:02.350940 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="848f53ca-813b-4f49-ba68-8d7a22732438" path="/var/lib/kubelet/pods/848f53ca-813b-4f49-ba68-8d7a22732438/volumes" Sep 30 05:44:02 crc kubenswrapper[4956]: I0930 05:44:02.351306 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdac8ebb-8d05-4766-9f09-0d6084d2168f" path="/var/lib/kubelet/pods/bdac8ebb-8d05-4766-9f09-0d6084d2168f/volumes" Sep 30 05:44:02 crc kubenswrapper[4956]: I0930 05:44:02.618642 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"82458356-9089-4c9d-a672-746eb618af3d","Type":"ContainerStarted","Data":"ffaedabbc7c07e73df5b2d0b0113e95bf7d3d05074b45ef92568a67fe29987b2"} Sep 30 05:44:02 crc kubenswrapper[4956]: E0930 05:44:02.620982 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="82458356-9089-4c9d-a672-746eb618af3d" Sep 30 05:44:02 crc kubenswrapper[4956]: I0930 05:44:02.621093 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bd6f01ba-ec4b-4b98-8a33-e029d86b258b","Type":"ContainerStarted","Data":"0c4d5a5d5aed543d93bb6cdb3b1bf3873b9d61db8da71c6cb51b9404f9248f73"} Sep 30 05:44:02 crc kubenswrapper[4956]: I0930 05:44:02.621161 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bd6f01ba-ec4b-4b98-8a33-e029d86b258b","Type":"ContainerStarted","Data":"4e9ec610eb7172813bf164b78065380688d6713353a348f135f3151bf9418bf5"} Sep 30 05:44:02 crc kubenswrapper[4956]: I0930 05:44:02.664939 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=27.69661198 podStartE2EDuration="30.664910026s" podCreationTimestamp="2025-09-30 05:43:32 +0000 UTC" firstStartedPulling="2025-09-30 05:43:58.575038733 +0000 UTC m=+908.902159258" lastFinishedPulling="2025-09-30 05:44:01.543336769 +0000 UTC m=+911.870457304" observedRunningTime="2025-09-30 05:44:02.652434807 +0000 UTC m=+912.979555372" watchObservedRunningTime="2025-09-30 05:44:02.664910026 +0000 UTC m=+912.992030591" Sep 30 05:44:03 crc kubenswrapper[4956]: I0930 05:44:03.594797 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 30 05:44:03 crc kubenswrapper[4956]: I0930 05:44:03.595181 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 30 05:44:03 crc kubenswrapper[4956]: E0930 05:44:03.630953 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="82458356-9089-4c9d-a672-746eb618af3d" Sep 30 05:44:04 crc kubenswrapper[4956]: I0930 05:44:04.490384 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 05:44:04 crc kubenswrapper[4956]: I0930 05:44:04.637612 4956 generic.go:334] "Generic (PLEG): container finished" podID="541322a8-d098-4331-ab5b-500262d4655c" containerID="70dca1d04711684976e8bff4998509ea24c6ca1ffe54efc06373a52014c1cd9d" exitCode=0 Sep 30 05:44:04 crc kubenswrapper[4956]: I0930 05:44:04.637692 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"541322a8-d098-4331-ab5b-500262d4655c","Type":"ContainerDied","Data":"70dca1d04711684976e8bff4998509ea24c6ca1ffe54efc06373a52014c1cd9d"} Sep 30 05:44:05 crc kubenswrapper[4956]: I0930 05:44:05.647376 4956 generic.go:334] "Generic (PLEG): container finished" podID="e0e95031-1e0d-4979-9926-ba52d0208646" containerID="4914039716bba2efb5fbdef5e5832debcfe6eee7fc54d8e5f1dd49e89f54d499" exitCode=0 Sep 30 05:44:05 crc kubenswrapper[4956]: I0930 05:44:05.647470 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0e95031-1e0d-4979-9926-ba52d0208646","Type":"ContainerDied","Data":"4914039716bba2efb5fbdef5e5832debcfe6eee7fc54d8e5f1dd49e89f54d499"} Sep 30 05:44:05 crc kubenswrapper[4956]: I0930 05:44:05.649717 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"541322a8-d098-4331-ab5b-500262d4655c","Type":"ContainerStarted","Data":"2902e6b6ce6ce08fe2d8a3aa0560858776acbcb7890d38b593155cbc91360362"} Sep 30 05:44:05 crc kubenswrapper[4956]: I0930 05:44:05.708339 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.338692986 podStartE2EDuration="43.70820889s" 
podCreationTimestamp="2025-09-30 05:43:22 +0000 UTC" firstStartedPulling="2025-09-30 05:43:24.82691699 +0000 UTC m=+875.154037505" lastFinishedPulling="2025-09-30 05:43:59.196432884 +0000 UTC m=+909.523553409" observedRunningTime="2025-09-30 05:44:05.70753192 +0000 UTC m=+916.034652465" watchObservedRunningTime="2025-09-30 05:44:05.70820889 +0000 UTC m=+916.035329415" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.304551 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.308217 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.420657 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-lmpj4"] Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.487396 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-547b55867-zks9n"] Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.495292 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.535800 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ed861ae-a771-4609-9d1b-1c1b17a28d62-dns-svc\") pod \"dnsmasq-dns-547b55867-zks9n\" (UID: \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\") " pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.536054 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed861ae-a771-4609-9d1b-1c1b17a28d62-config\") pod \"dnsmasq-dns-547b55867-zks9n\" (UID: \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\") " pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.536191 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7b7d\" (UniqueName: \"kubernetes.io/projected/5ed861ae-a771-4609-9d1b-1c1b17a28d62-kube-api-access-b7b7d\") pod \"dnsmasq-dns-547b55867-zks9n\" (UID: \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\") " pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.541978 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-547b55867-zks9n"] Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.639906 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ed861ae-a771-4609-9d1b-1c1b17a28d62-dns-svc\") pod \"dnsmasq-dns-547b55867-zks9n\" (UID: \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\") " pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.639948 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5ed861ae-a771-4609-9d1b-1c1b17a28d62-config\") pod \"dnsmasq-dns-547b55867-zks9n\" (UID: \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\") " pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.640010 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7b7d\" (UniqueName: \"kubernetes.io/projected/5ed861ae-a771-4609-9d1b-1c1b17a28d62-kube-api-access-b7b7d\") pod \"dnsmasq-dns-547b55867-zks9n\" (UID: \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\") " pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.641170 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ed861ae-a771-4609-9d1b-1c1b17a28d62-dns-svc\") pod \"dnsmasq-dns-547b55867-zks9n\" (UID: \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\") " pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.641698 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed861ae-a771-4609-9d1b-1c1b17a28d62-config\") pod \"dnsmasq-dns-547b55867-zks9n\" (UID: \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\") " pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.657867 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.661935 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0e95031-1e0d-4979-9926-ba52d0208646","Type":"ContainerStarted","Data":"b40e7e3a76abc8c2870420bf05bca53f8ef9b2a250bee43a2770148b2dcbb23e"} Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.686256 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7b7d\" (UniqueName: 
\"kubernetes.io/projected/5ed861ae-a771-4609-9d1b-1c1b17a28d62-kube-api-access-b7b7d\") pod \"dnsmasq-dns-547b55867-zks9n\" (UID: \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\") " pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.695584 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.739677 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=10.517424706 podStartE2EDuration="44.739658264s" podCreationTimestamp="2025-09-30 05:43:22 +0000 UTC" firstStartedPulling="2025-09-30 05:43:24.97609606 +0000 UTC m=+875.303216585" lastFinishedPulling="2025-09-30 05:43:59.198329628 +0000 UTC m=+909.525450143" observedRunningTime="2025-09-30 05:44:06.716415635 +0000 UTC m=+917.043536150" watchObservedRunningTime="2025-09-30 05:44:06.739658264 +0000 UTC m=+917.066778789" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.834444 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.953428 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-vpm8h"] Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.990604 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757cddf575-wxj5r"] Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.991949 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:06 crc kubenswrapper[4956]: I0930 05:44:06.996361 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.016093 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757cddf575-wxj5r"] Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.021989 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.056845 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/093f99c7-9f50-497d-b89c-33887663166b-dns-svc\") pod \"093f99c7-9f50-497d-b89c-33887663166b\" (UID: \"093f99c7-9f50-497d-b89c-33887663166b\") " Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.056920 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/093f99c7-9f50-497d-b89c-33887663166b-config\") pod \"093f99c7-9f50-497d-b89c-33887663166b\" (UID: \"093f99c7-9f50-497d-b89c-33887663166b\") " Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.057063 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n78sp\" (UniqueName: \"kubernetes.io/projected/093f99c7-9f50-497d-b89c-33887663166b-kube-api-access-n78sp\") pod \"093f99c7-9f50-497d-b89c-33887663166b\" (UID: \"093f99c7-9f50-497d-b89c-33887663166b\") " Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.057610 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-config\") pod \"dnsmasq-dns-757cddf575-wxj5r\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " 
pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.057623 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/093f99c7-9f50-497d-b89c-33887663166b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "093f99c7-9f50-497d-b89c-33887663166b" (UID: "093f99c7-9f50-497d-b89c-33887663166b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.057824 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-dns-svc\") pod \"dnsmasq-dns-757cddf575-wxj5r\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.058034 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxx77\" (UniqueName: \"kubernetes.io/projected/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-kube-api-access-gxx77\") pod \"dnsmasq-dns-757cddf575-wxj5r\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.058141 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-ovsdbserver-sb\") pod \"dnsmasq-dns-757cddf575-wxj5r\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.058288 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/093f99c7-9f50-497d-b89c-33887663166b-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 
05:44:07.063451 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/093f99c7-9f50-497d-b89c-33887663166b-config" (OuterVolumeSpecName: "config") pod "093f99c7-9f50-497d-b89c-33887663166b" (UID: "093f99c7-9f50-497d-b89c-33887663166b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.092313 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/093f99c7-9f50-497d-b89c-33887663166b-kube-api-access-n78sp" (OuterVolumeSpecName: "kube-api-access-n78sp") pod "093f99c7-9f50-497d-b89c-33887663166b" (UID: "093f99c7-9f50-497d-b89c-33887663166b"). InnerVolumeSpecName "kube-api-access-n78sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.117005 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hfrcl"] Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.120242 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.122678 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.126055 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hfrcl"] Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.169974 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/351735b0-6497-4a6c-9562-1ad2785ead5f-config\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.170049 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxx77\" (UniqueName: \"kubernetes.io/projected/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-kube-api-access-gxx77\") pod \"dnsmasq-dns-757cddf575-wxj5r\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.170122 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtcm5\" (UniqueName: \"kubernetes.io/projected/351735b0-6497-4a6c-9562-1ad2785ead5f-kube-api-access-dtcm5\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.170149 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/351735b0-6497-4a6c-9562-1ad2785ead5f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hfrcl\" (UID: 
\"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.170188 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-ovsdbserver-sb\") pod \"dnsmasq-dns-757cddf575-wxj5r\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.170293 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-config\") pod \"dnsmasq-dns-757cddf575-wxj5r\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.170336 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/351735b0-6497-4a6c-9562-1ad2785ead5f-ovs-rundir\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.170360 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351735b0-6497-4a6c-9562-1ad2785ead5f-combined-ca-bundle\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.170489 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/351735b0-6497-4a6c-9562-1ad2785ead5f-ovn-rundir\") pod \"ovn-controller-metrics-hfrcl\" (UID: 
\"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.170557 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-dns-svc\") pod \"dnsmasq-dns-757cddf575-wxj5r\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.170629 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/093f99c7-9f50-497d-b89c-33887663166b-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.170644 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n78sp\" (UniqueName: \"kubernetes.io/projected/093f99c7-9f50-497d-b89c-33887663166b-kube-api-access-n78sp\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.172366 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-dns-svc\") pod \"dnsmasq-dns-757cddf575-wxj5r\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.177403 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-config\") pod \"dnsmasq-dns-757cddf575-wxj5r\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.178066 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-ovsdbserver-sb\") pod 
\"dnsmasq-dns-757cddf575-wxj5r\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.197807 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxx77\" (UniqueName: \"kubernetes.io/projected/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-kube-api-access-gxx77\") pod \"dnsmasq-dns-757cddf575-wxj5r\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.271697 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/351735b0-6497-4a6c-9562-1ad2785ead5f-ovs-rundir\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.271739 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351735b0-6497-4a6c-9562-1ad2785ead5f-combined-ca-bundle\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.271768 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/351735b0-6497-4a6c-9562-1ad2785ead5f-ovn-rundir\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.271810 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/351735b0-6497-4a6c-9562-1ad2785ead5f-config\") pod \"ovn-controller-metrics-hfrcl\" (UID: 
\"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.271847 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtcm5\" (UniqueName: \"kubernetes.io/projected/351735b0-6497-4a6c-9562-1ad2785ead5f-kube-api-access-dtcm5\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.271867 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/351735b0-6497-4a6c-9562-1ad2785ead5f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.272330 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/351735b0-6497-4a6c-9562-1ad2785ead5f-ovn-rundir\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.272343 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/351735b0-6497-4a6c-9562-1ad2785ead5f-ovs-rundir\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.273005 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/351735b0-6497-4a6c-9562-1ad2785ead5f-config\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " 
pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.280715 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/351735b0-6497-4a6c-9562-1ad2785ead5f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.281635 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351735b0-6497-4a6c-9562-1ad2785ead5f-combined-ca-bundle\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.292595 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtcm5\" (UniqueName: \"kubernetes.io/projected/351735b0-6497-4a6c-9562-1ad2785ead5f-kube-api-access-dtcm5\") pod \"ovn-controller-metrics-hfrcl\" (UID: \"351735b0-6497-4a6c-9562-1ad2785ead5f\") " pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.356667 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.365541 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-547b55867-zks9n"] Sep 30 05:44:07 crc kubenswrapper[4956]: W0930 05:44:07.373404 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ed861ae_a771_4609_9d1b_1c1b17a28d62.slice/crio-80eebeab9ecc9d05c7593bf102cb81b5f1f84a599deb8a0e232f7cda4846385b WatchSource:0}: Error finding container 80eebeab9ecc9d05c7593bf102cb81b5f1f84a599deb8a0e232f7cda4846385b: Status 404 returned error can't find the container with id 80eebeab9ecc9d05c7593bf102cb81b5f1f84a599deb8a0e232f7cda4846385b Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.374669 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.438521 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hfrcl" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.450168 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547b55867-zks9n"] Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.481775 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9zsj\" (UniqueName: \"kubernetes.io/projected/505e587f-c235-464f-9bea-d5ae3f8fb219-kube-api-access-j9zsj\") pod \"505e587f-c235-464f-9bea-d5ae3f8fb219\" (UID: \"505e587f-c235-464f-9bea-d5ae3f8fb219\") " Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.481832 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/505e587f-c235-464f-9bea-d5ae3f8fb219-config\") pod \"505e587f-c235-464f-9bea-d5ae3f8fb219\" (UID: \"505e587f-c235-464f-9bea-d5ae3f8fb219\") " Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.481871 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/505e587f-c235-464f-9bea-d5ae3f8fb219-dns-svc\") pod \"505e587f-c235-464f-9bea-d5ae3f8fb219\" (UID: \"505e587f-c235-464f-9bea-d5ae3f8fb219\") " Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.482691 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/505e587f-c235-464f-9bea-d5ae3f8fb219-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "505e587f-c235-464f-9bea-d5ae3f8fb219" (UID: "505e587f-c235-464f-9bea-d5ae3f8fb219"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.482926 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/505e587f-c235-464f-9bea-d5ae3f8fb219-config" (OuterVolumeSpecName: "config") pod "505e587f-c235-464f-9bea-d5ae3f8fb219" (UID: "505e587f-c235-464f-9bea-d5ae3f8fb219"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.483289 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/505e587f-c235-464f-9bea-d5ae3f8fb219-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.483306 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/505e587f-c235-464f-9bea-d5ae3f8fb219-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.485325 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505e587f-c235-464f-9bea-d5ae3f8fb219-kube-api-access-j9zsj" (OuterVolumeSpecName: "kube-api-access-j9zsj") pod "505e587f-c235-464f-9bea-d5ae3f8fb219" (UID: "505e587f-c235-464f-9bea-d5ae3f8fb219"). InnerVolumeSpecName "kube-api-access-j9zsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.492879 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-f688r"] Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.494976 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.501488 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.509582 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-f688r"] Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.585541 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79q5c\" (UniqueName: \"kubernetes.io/projected/44991506-83b3-4752-8cc3-405ab0347a3f-kube-api-access-79q5c\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.585581 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-ovsdbserver-sb\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.585605 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-config\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.585651 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-dns-svc\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " 
pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.585721 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-ovsdbserver-nb\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.585764 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9zsj\" (UniqueName: \"kubernetes.io/projected/505e587f-c235-464f-9bea-d5ae3f8fb219-kube-api-access-j9zsj\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.619730 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.629513 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.632007 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.632305 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.632431 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.632621 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-q27pf" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.648777 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.672095 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-547b55867-zks9n" event={"ID":"5ed861ae-a771-4609-9d1b-1c1b17a28d62","Type":"ContainerStarted","Data":"80eebeab9ecc9d05c7593bf102cb81b5f1f84a599deb8a0e232f7cda4846385b"} Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.674878 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" event={"ID":"505e587f-c235-464f-9bea-d5ae3f8fb219","Type":"ContainerDied","Data":"380ea132b18fe1a00d58feef573a39a47c70675aa1e7510bcd24bec285d38a45"} Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.674950 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b8f4ff9-vpm8h" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.677105 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5449989c59-lmpj4" event={"ID":"093f99c7-9f50-497d-b89c-33887663166b","Type":"ContainerDied","Data":"1b415c165fc4546d286cb388fdb4e92b9842a8072113f9d8edf48c1622e8afb1"} Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.677900 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5449989c59-lmpj4" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.684101 4956 generic.go:334] "Generic (PLEG): container finished" podID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerID="d77eb0cfddb1f45cbacce87234ca631853001469f8ce60c4a37151e6a621f3bf" exitCode=0 Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.684262 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f68c4eb3-451f-460b-996f-3a36ac7da7e2","Type":"ContainerDied","Data":"d77eb0cfddb1f45cbacce87234ca631853001469f8ce60c4a37151e6a621f3bf"} Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.690596 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-ovsdbserver-nb\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.691439 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79q5c\" (UniqueName: \"kubernetes.io/projected/44991506-83b3-4752-8cc3-405ab0347a3f-kube-api-access-79q5c\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.691590 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-ovsdbserver-sb\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.691967 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-config\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.692073 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.692481 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.692517 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-dns-svc\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.692558 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/38f02895-66c4-4da2-b408-838646d7ecbd-lock\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.692588 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/38f02895-66c4-4da2-b408-838646d7ecbd-cache\") pod \"swift-storage-0\" (UID: 
\"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.692616 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbhsr\" (UniqueName: \"kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-kube-api-access-cbhsr\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.692940 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-ovsdbserver-sb\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.692960 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-ovsdbserver-nb\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.693647 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-dns-svc\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.695962 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-config\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc 
kubenswrapper[4956]: I0930 05:44:07.712727 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79q5c\" (UniqueName: \"kubernetes.io/projected/44991506-83b3-4752-8cc3-405ab0347a3f-kube-api-access-79q5c\") pod \"dnsmasq-dns-76f9c4c8bc-f688r\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.794517 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.794838 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.794867 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/38f02895-66c4-4da2-b408-838646d7ecbd-lock\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.794893 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/38f02895-66c4-4da2-b408-838646d7ecbd-cache\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.794912 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbhsr\" (UniqueName: 
\"kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-kube-api-access-cbhsr\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: E0930 05:44:07.795889 4956 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 05:44:07 crc kubenswrapper[4956]: E0930 05:44:07.795908 4956 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 05:44:07 crc kubenswrapper[4956]: E0930 05:44:07.795947 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift podName:38f02895-66c4-4da2-b408-838646d7ecbd nodeName:}" failed. No retries permitted until 2025-09-30 05:44:08.295932463 +0000 UTC m=+918.623052988 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift") pod "swift-storage-0" (UID: "38f02895-66c4-4da2-b408-838646d7ecbd") : configmap "swift-ring-files" not found Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.796313 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/38f02895-66c4-4da2-b408-838646d7ecbd-cache\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.796344 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/38f02895-66c4-4da2-b408-838646d7ecbd-lock\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.796615 4956 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.819089 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.831374 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbhsr\" (UniqueName: \"kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-kube-api-access-cbhsr\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.839754 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.851707 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-lmpj4"] Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.862168 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-lmpj4"] Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.879939 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-vpm8h"] Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.885584 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86b8f4ff9-vpm8h"] Sep 30 05:44:07 crc kubenswrapper[4956]: I0930 05:44:07.942087 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757cddf575-wxj5r"] Sep 30 05:44:08 
crc kubenswrapper[4956]: I0930 05:44:08.077579 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hfrcl"] Sep 30 05:44:08 crc kubenswrapper[4956]: W0930 05:44:08.097610 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod351735b0_6497_4a6c_9562_1ad2785ead5f.slice/crio-f4bc2bcc8948a24bdfd3a4e6df29f1503645a1146f291adde4a55449a307e4b1 WatchSource:0}: Error finding container f4bc2bcc8948a24bdfd3a4e6df29f1503645a1146f291adde4a55449a307e4b1: Status 404 returned error can't find the container with id f4bc2bcc8948a24bdfd3a4e6df29f1503645a1146f291adde4a55449a307e4b1 Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.157522 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2zwhw"] Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.158549 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.162725 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.162754 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.162724 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.191232 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2zwhw"] Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.209280 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp96t\" (UniqueName: \"kubernetes.io/projected/547b221a-e3b1-4a31-b09c-07022356f1e9-kube-api-access-qp96t\") pod 
\"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.209393 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-swiftconf\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.209453 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/547b221a-e3b1-4a31-b09c-07022356f1e9-etc-swift\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.209483 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-dispersionconf\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.209501 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/547b221a-e3b1-4a31-b09c-07022356f1e9-ring-data-devices\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.209688 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547b221a-e3b1-4a31-b09c-07022356f1e9-scripts\") pod 
\"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.209731 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-combined-ca-bundle\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.294997 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-f688r"] Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.317217 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-combined-ca-bundle\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.317695 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:08 crc kubenswrapper[4956]: E0930 05:44:08.317889 4956 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 05:44:08 crc kubenswrapper[4956]: E0930 05:44:08.317981 4956 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 05:44:08 crc kubenswrapper[4956]: E0930 05:44:08.318089 4956 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift podName:38f02895-66c4-4da2-b408-838646d7ecbd nodeName:}" failed. No retries permitted until 2025-09-30 05:44:09.318072099 +0000 UTC m=+919.645192624 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift") pod "swift-storage-0" (UID: "38f02895-66c4-4da2-b408-838646d7ecbd") : configmap "swift-ring-files" not found Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.318017 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp96t\" (UniqueName: \"kubernetes.io/projected/547b221a-e3b1-4a31-b09c-07022356f1e9-kube-api-access-qp96t\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.318460 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-swiftconf\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.318632 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/547b221a-e3b1-4a31-b09c-07022356f1e9-etc-swift\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.318759 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-dispersionconf\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " 
pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.319128 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/547b221a-e3b1-4a31-b09c-07022356f1e9-ring-data-devices\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.319318 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547b221a-e3b1-4a31-b09c-07022356f1e9-scripts\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.319859 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/547b221a-e3b1-4a31-b09c-07022356f1e9-ring-data-devices\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.319046 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/547b221a-e3b1-4a31-b09c-07022356f1e9-etc-swift\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.320089 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547b221a-e3b1-4a31-b09c-07022356f1e9-scripts\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.322039 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-combined-ca-bundle\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.322359 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-dispersionconf\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.333798 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-swiftconf\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.339562 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp96t\" (UniqueName: \"kubernetes.io/projected/547b221a-e3b1-4a31-b09c-07022356f1e9-kube-api-access-qp96t\") pod \"swift-ring-rebalance-2zwhw\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.353187 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="093f99c7-9f50-497d-b89c-33887663166b" path="/var/lib/kubelet/pods/093f99c7-9f50-497d-b89c-33887663166b/volumes" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.353649 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="505e587f-c235-464f-9bea-d5ae3f8fb219" path="/var/lib/kubelet/pods/505e587f-c235-464f-9bea-d5ae3f8fb219/volumes" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 
05:44:08.482248 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.694066 4956 generic.go:334] "Generic (PLEG): container finished" podID="77ab71aa-384e-4d3d-a3cc-c5b48d391adf" containerID="30cb2f863c1464a207e09afe7952b5fa06a2eb7cc6ebf692fefbf60909aa8ada" exitCode=0 Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.694438 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" event={"ID":"77ab71aa-384e-4d3d-a3cc-c5b48d391adf","Type":"ContainerDied","Data":"30cb2f863c1464a207e09afe7952b5fa06a2eb7cc6ebf692fefbf60909aa8ada"} Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.694480 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" event={"ID":"77ab71aa-384e-4d3d-a3cc-c5b48d391adf","Type":"ContainerStarted","Data":"c451c29c4c28bb471b684032b69e490ff8ee78582df0444f2f421545a42bd50b"} Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.699627 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ed861ae-a771-4609-9d1b-1c1b17a28d62" containerID="100b4bfd34c9bf43df8dc3fde1756ef17eb1c9fcdc54215b42c1f0f36c9fd071" exitCode=0 Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.699676 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547b55867-zks9n" event={"ID":"5ed861ae-a771-4609-9d1b-1c1b17a28d62","Type":"ContainerDied","Data":"100b4bfd34c9bf43df8dc3fde1756ef17eb1c9fcdc54215b42c1f0f36c9fd071"} Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.705563 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hfrcl" event={"ID":"351735b0-6497-4a6c-9562-1ad2785ead5f","Type":"ContainerStarted","Data":"9a219fd8e292030019650d551abe880a1a503138ef65d5d6b5ba0ffb9c92f349"} Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.705631 4956 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-controller-metrics-hfrcl" event={"ID":"351735b0-6497-4a6c-9562-1ad2785ead5f","Type":"ContainerStarted","Data":"f4bc2bcc8948a24bdfd3a4e6df29f1503645a1146f291adde4a55449a307e4b1"} Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.708316 4956 generic.go:334] "Generic (PLEG): container finished" podID="44991506-83b3-4752-8cc3-405ab0347a3f" containerID="6277390e1a01f01097b5d142ff84d6e16ab17d6b73b27122ac7ca1a0b0b9440b" exitCode=0 Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.708366 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" event={"ID":"44991506-83b3-4752-8cc3-405ab0347a3f","Type":"ContainerDied","Data":"6277390e1a01f01097b5d142ff84d6e16ab17d6b73b27122ac7ca1a0b0b9440b"} Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.708388 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" event={"ID":"44991506-83b3-4752-8cc3-405ab0347a3f","Type":"ContainerStarted","Data":"77c906702dd6a1b497a8f0e90a9b720eeb13a7d0bde979da594b4b0d159ddd5c"} Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.743693 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hfrcl" podStartSLOduration=1.743673609 podStartE2EDuration="1.743673609s" podCreationTimestamp="2025-09-30 05:44:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:44:08.73294226 +0000 UTC m=+919.060062785" watchObservedRunningTime="2025-09-30 05:44:08.743673609 +0000 UTC m=+919.070794134" Sep 30 05:44:08 crc kubenswrapper[4956]: I0930 05:44:08.938667 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2zwhw"] Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.127001 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.237281 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7b7d\" (UniqueName: \"kubernetes.io/projected/5ed861ae-a771-4609-9d1b-1c1b17a28d62-kube-api-access-b7b7d\") pod \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\" (UID: \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\") " Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.237350 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed861ae-a771-4609-9d1b-1c1b17a28d62-config\") pod \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\" (UID: \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\") " Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.237385 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ed861ae-a771-4609-9d1b-1c1b17a28d62-dns-svc\") pod \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\" (UID: \"5ed861ae-a771-4609-9d1b-1c1b17a28d62\") " Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.242503 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed861ae-a771-4609-9d1b-1c1b17a28d62-kube-api-access-b7b7d" (OuterVolumeSpecName: "kube-api-access-b7b7d") pod "5ed861ae-a771-4609-9d1b-1c1b17a28d62" (UID: "5ed861ae-a771-4609-9d1b-1c1b17a28d62"). InnerVolumeSpecName "kube-api-access-b7b7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.268341 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed861ae-a771-4609-9d1b-1c1b17a28d62-config" (OuterVolumeSpecName: "config") pod "5ed861ae-a771-4609-9d1b-1c1b17a28d62" (UID: "5ed861ae-a771-4609-9d1b-1c1b17a28d62"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.276430 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed861ae-a771-4609-9d1b-1c1b17a28d62-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ed861ae-a771-4609-9d1b-1c1b17a28d62" (UID: "5ed861ae-a771-4609-9d1b-1c1b17a28d62"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.340662 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:09 crc kubenswrapper[4956]: E0930 05:44:09.340830 4956 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 05:44:09 crc kubenswrapper[4956]: E0930 05:44:09.341305 4956 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.341387 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7b7d\" (UniqueName: \"kubernetes.io/projected/5ed861ae-a771-4609-9d1b-1c1b17a28d62-kube-api-access-b7b7d\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.341519 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed861ae-a771-4609-9d1b-1c1b17a28d62-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:09 crc kubenswrapper[4956]: E0930 05:44:09.341494 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift podName:38f02895-66c4-4da2-b408-838646d7ecbd 
nodeName:}" failed. No retries permitted until 2025-09-30 05:44:11.341479762 +0000 UTC m=+921.668600287 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift") pod "swift-storage-0" (UID: "38f02895-66c4-4da2-b408-838646d7ecbd") : configmap "swift-ring-files" not found Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.341559 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ed861ae-a771-4609-9d1b-1c1b17a28d62-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.722626 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" event={"ID":"44991506-83b3-4752-8cc3-405ab0347a3f","Type":"ContainerStarted","Data":"522b4001a1ee49e91d4c452cee6d7d6d57c6eb90ee9796861627746a53a04e7d"} Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.722966 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.724679 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" event={"ID":"77ab71aa-384e-4d3d-a3cc-c5b48d391adf","Type":"ContainerStarted","Data":"8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04"} Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.724737 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.726273 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2zwhw" event={"ID":"547b221a-e3b1-4a31-b09c-07022356f1e9","Type":"ContainerStarted","Data":"f2360b452d90d62c435de75a3be924f4d4c3a0d26670032ec61a9c9e037bfab2"} Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.728019 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547b55867-zks9n" event={"ID":"5ed861ae-a771-4609-9d1b-1c1b17a28d62","Type":"ContainerDied","Data":"80eebeab9ecc9d05c7593bf102cb81b5f1f84a599deb8a0e232f7cda4846385b"} Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.728033 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-547b55867-zks9n" Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.728052 4956 scope.go:117] "RemoveContainer" containerID="100b4bfd34c9bf43df8dc3fde1756ef17eb1c9fcdc54215b42c1f0f36c9fd071" Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.750352 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" podStartSLOduration=2.75033255 podStartE2EDuration="2.75033255s" podCreationTimestamp="2025-09-30 05:44:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:44:09.738765767 +0000 UTC m=+920.065886322" watchObservedRunningTime="2025-09-30 05:44:09.75033255 +0000 UTC m=+920.077453075" Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.795426 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" podStartSLOduration=3.795404336 podStartE2EDuration="3.795404336s" podCreationTimestamp="2025-09-30 05:44:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:44:09.763526129 +0000 UTC m=+920.090646654" watchObservedRunningTime="2025-09-30 05:44:09.795404336 +0000 UTC m=+920.122524861" Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.808330 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547b55867-zks9n"] Sep 30 05:44:09 crc kubenswrapper[4956]: I0930 05:44:09.809970 4956 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-547b55867-zks9n"] Sep 30 05:44:10 crc kubenswrapper[4956]: E0930 05:44:10.338049 4956 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.82:53244->38.102.83.82:43469: write tcp 38.102.83.82:53244->38.102.83.82:43469: write: broken pipe Sep 30 05:44:10 crc kubenswrapper[4956]: I0930 05:44:10.357824 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed861ae-a771-4609-9d1b-1c1b17a28d62" path="/var/lib/kubelet/pods/5ed861ae-a771-4609-9d1b-1c1b17a28d62/volumes" Sep 30 05:44:10 crc kubenswrapper[4956]: I0930 05:44:10.750752 4956 generic.go:334] "Generic (PLEG): container finished" podID="f61a8fc3-4802-4dfb-b17e-fd4b8db1b863" containerID="dc10716cefc5eef22679069c31411dbd35197d73dc77cb55d6e2a63f3a214098" exitCode=0 Sep 30 05:44:10 crc kubenswrapper[4956]: I0930 05:44:10.750836 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9lvd" event={"ID":"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863","Type":"ContainerDied","Data":"dc10716cefc5eef22679069c31411dbd35197d73dc77cb55d6e2a63f3a214098"} Sep 30 05:44:11 crc kubenswrapper[4956]: I0930 05:44:11.380982 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:11 crc kubenswrapper[4956]: E0930 05:44:11.381196 4956 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 05:44:11 crc kubenswrapper[4956]: E0930 05:44:11.381373 4956 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 05:44:11 crc kubenswrapper[4956]: E0930 05:44:11.381427 4956 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift podName:38f02895-66c4-4da2-b408-838646d7ecbd nodeName:}" failed. No retries permitted until 2025-09-30 05:44:15.381411999 +0000 UTC m=+925.708532524 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift") pod "swift-storage-0" (UID: "38f02895-66c4-4da2-b408-838646d7ecbd") : configmap "swift-ring-files" not found Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.043130 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.043643 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.120878 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.153826 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.153974 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.236482 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.786382 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2zwhw" event={"ID":"547b221a-e3b1-4a31-b09c-07022356f1e9","Type":"ContainerStarted","Data":"c17858db66d349a9461f55db1edf631ff7efef6c3ab8b226618b8fe700994708"} Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.790454 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-p9lvd" event={"ID":"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863","Type":"ContainerStarted","Data":"14b1e9fbc9d0d72a5c20e6a71e78f55cef532ce196d0f8650f8a95eb12ad8830"} Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.790478 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9lvd" event={"ID":"f61a8fc3-4802-4dfb-b17e-fd4b8db1b863","Type":"ContainerStarted","Data":"597418c7d3a99360085ae750902d0fba5d4652aacf9aff3da6f2e1e74c56940a"} Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.790876 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.790911 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.792779 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f68c4eb3-451f-460b-996f-3a36ac7da7e2","Type":"ContainerStarted","Data":"0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8"} Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.794488 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r7wfs" event={"ID":"611676cd-11d2-44c4-bae2-41b6b22f898d","Type":"ContainerStarted","Data":"0d5ba556bf497ff4181997ef4ee5f40497a6b830fbf4d1ddf92174b0116356d5"} Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.794848 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-r7wfs" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.814597 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2zwhw" podStartSLOduration=2.198485039 podStartE2EDuration="6.814578756s" podCreationTimestamp="2025-09-30 05:44:08 +0000 UTC" firstStartedPulling="2025-09-30 05:44:09.005206481 +0000 UTC 
m=+919.332327006" lastFinishedPulling="2025-09-30 05:44:13.621300198 +0000 UTC m=+923.948420723" observedRunningTime="2025-09-30 05:44:14.813530106 +0000 UTC m=+925.140650661" watchObservedRunningTime="2025-09-30 05:44:14.814578756 +0000 UTC m=+925.141699281" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.856619 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-p9lvd" podStartSLOduration=18.75243382 podStartE2EDuration="45.856595515s" podCreationTimestamp="2025-09-30 05:43:29 +0000 UTC" firstStartedPulling="2025-09-30 05:43:42.389084292 +0000 UTC m=+892.716204817" lastFinishedPulling="2025-09-30 05:44:09.493245987 +0000 UTC m=+919.820366512" observedRunningTime="2025-09-30 05:44:14.852500377 +0000 UTC m=+925.179620972" watchObservedRunningTime="2025-09-30 05:44:14.856595515 +0000 UTC m=+925.183716060" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.874740 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-r7wfs" podStartSLOduration=14.651348794 podStartE2EDuration="45.874722766s" podCreationTimestamp="2025-09-30 05:43:29 +0000 UTC" firstStartedPulling="2025-09-30 05:43:42.38625769 +0000 UTC m=+892.713378215" lastFinishedPulling="2025-09-30 05:44:13.609631662 +0000 UTC m=+923.936752187" observedRunningTime="2025-09-30 05:44:14.868527368 +0000 UTC m=+925.195647923" watchObservedRunningTime="2025-09-30 05:44:14.874722766 +0000 UTC m=+925.201843291" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.899791 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 05:44:14 crc kubenswrapper[4956]: I0930 05:44:14.901313 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 30 05:44:15 crc kubenswrapper[4956]: I0930 05:44:15.447473 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:15 crc kubenswrapper[4956]: E0930 05:44:15.447554 4956 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 05:44:15 crc kubenswrapper[4956]: E0930 05:44:15.447831 4956 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 05:44:15 crc kubenswrapper[4956]: E0930 05:44:15.447890 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift podName:38f02895-66c4-4da2-b408-838646d7ecbd nodeName:}" failed. No retries permitted until 2025-09-30 05:44:23.447872719 +0000 UTC m=+933.774993324 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift") pod "swift-storage-0" (UID: "38f02895-66c4-4da2-b408-838646d7ecbd") : configmap "swift-ring-files" not found Sep 30 05:44:16 crc kubenswrapper[4956]: I0930 05:44:16.547092 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-qm5wt"] Sep 30 05:44:16 crc kubenswrapper[4956]: E0930 05:44:16.547644 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed861ae-a771-4609-9d1b-1c1b17a28d62" containerName="init" Sep 30 05:44:16 crc kubenswrapper[4956]: I0930 05:44:16.547661 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed861ae-a771-4609-9d1b-1c1b17a28d62" containerName="init" Sep 30 05:44:16 crc kubenswrapper[4956]: I0930 05:44:16.547860 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed861ae-a771-4609-9d1b-1c1b17a28d62" containerName="init" Sep 30 05:44:16 crc kubenswrapper[4956]: I0930 
05:44:16.548558 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-qm5wt" Sep 30 05:44:16 crc kubenswrapper[4956]: I0930 05:44:16.553478 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-qm5wt"] Sep 30 05:44:16 crc kubenswrapper[4956]: I0930 05:44:16.569623 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lgpg\" (UniqueName: \"kubernetes.io/projected/5a30fe5e-a789-4251-87bb-bba57356337a-kube-api-access-4lgpg\") pod \"watcher-db-create-qm5wt\" (UID: \"5a30fe5e-a789-4251-87bb-bba57356337a\") " pod="openstack/watcher-db-create-qm5wt" Sep 30 05:44:16 crc kubenswrapper[4956]: I0930 05:44:16.672059 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lgpg\" (UniqueName: \"kubernetes.io/projected/5a30fe5e-a789-4251-87bb-bba57356337a-kube-api-access-4lgpg\") pod \"watcher-db-create-qm5wt\" (UID: \"5a30fe5e-a789-4251-87bb-bba57356337a\") " pod="openstack/watcher-db-create-qm5wt" Sep 30 05:44:16 crc kubenswrapper[4956]: I0930 05:44:16.695183 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lgpg\" (UniqueName: \"kubernetes.io/projected/5a30fe5e-a789-4251-87bb-bba57356337a-kube-api-access-4lgpg\") pod \"watcher-db-create-qm5wt\" (UID: \"5a30fe5e-a789-4251-87bb-bba57356337a\") " pod="openstack/watcher-db-create-qm5wt" Sep 30 05:44:16 crc kubenswrapper[4956]: I0930 05:44:16.812706 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f68c4eb3-451f-460b-996f-3a36ac7da7e2","Type":"ContainerStarted","Data":"af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180"} Sep 30 05:44:16 crc kubenswrapper[4956]: I0930 05:44:16.880173 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-qm5wt" Sep 30 05:44:17 crc kubenswrapper[4956]: I0930 05:44:17.306015 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-qm5wt"] Sep 30 05:44:17 crc kubenswrapper[4956]: W0930 05:44:17.314879 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a30fe5e_a789_4251_87bb_bba57356337a.slice/crio-91059a9503b5a763af84cc6c3469d2400f0c6f1c45ea96078f72de3853d7bfe8 WatchSource:0}: Error finding container 91059a9503b5a763af84cc6c3469d2400f0c6f1c45ea96078f72de3853d7bfe8: Status 404 returned error can't find the container with id 91059a9503b5a763af84cc6c3469d2400f0c6f1c45ea96078f72de3853d7bfe8 Sep 30 05:44:17 crc kubenswrapper[4956]: I0930 05:44:17.358260 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:17 crc kubenswrapper[4956]: I0930 05:44:17.821015 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:17 crc kubenswrapper[4956]: I0930 05:44:17.823863 4956 generic.go:334] "Generic (PLEG): container finished" podID="5a30fe5e-a789-4251-87bb-bba57356337a" containerID="e35bfb8b1e3edbe9c7812f2407e9c810d0b32bcb5dd01a7a4d3fabb9d064bf18" exitCode=0 Sep 30 05:44:17 crc kubenswrapper[4956]: I0930 05:44:17.823894 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-qm5wt" event={"ID":"5a30fe5e-a789-4251-87bb-bba57356337a","Type":"ContainerDied","Data":"e35bfb8b1e3edbe9c7812f2407e9c810d0b32bcb5dd01a7a4d3fabb9d064bf18"} Sep 30 05:44:17 crc kubenswrapper[4956]: I0930 05:44:17.823911 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-qm5wt" 
event={"ID":"5a30fe5e-a789-4251-87bb-bba57356337a","Type":"ContainerStarted","Data":"91059a9503b5a763af84cc6c3469d2400f0c6f1c45ea96078f72de3853d7bfe8"} Sep 30 05:44:17 crc kubenswrapper[4956]: I0930 05:44:17.889871 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757cddf575-wxj5r"] Sep 30 05:44:17 crc kubenswrapper[4956]: I0930 05:44:17.890127 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" podUID="77ab71aa-384e-4d3d-a3cc-c5b48d391adf" containerName="dnsmasq-dns" containerID="cri-o://8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04" gracePeriod=10 Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.073411 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.073462 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.073506 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.075875 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"357cd0f449d0c4885ee791783ca31e326eaaa7e5f1a04d708b2435c26c26f499"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.075938 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://357cd0f449d0c4885ee791783ca31e326eaaa7e5f1a04d708b2435c26c26f499" gracePeriod=600 Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.364648 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.515657 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-dns-svc\") pod \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.515928 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-config\") pod \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.515990 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxx77\" (UniqueName: \"kubernetes.io/projected/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-kube-api-access-gxx77\") pod \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\" (UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.516077 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-ovsdbserver-sb\") pod \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\" 
(UID: \"77ab71aa-384e-4d3d-a3cc-c5b48d391adf\") " Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.523240 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-kube-api-access-gxx77" (OuterVolumeSpecName: "kube-api-access-gxx77") pod "77ab71aa-384e-4d3d-a3cc-c5b48d391adf" (UID: "77ab71aa-384e-4d3d-a3cc-c5b48d391adf"). InnerVolumeSpecName "kube-api-access-gxx77". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.567488 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-config" (OuterVolumeSpecName: "config") pod "77ab71aa-384e-4d3d-a3cc-c5b48d391adf" (UID: "77ab71aa-384e-4d3d-a3cc-c5b48d391adf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.569817 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "77ab71aa-384e-4d3d-a3cc-c5b48d391adf" (UID: "77ab71aa-384e-4d3d-a3cc-c5b48d391adf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.597229 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "77ab71aa-384e-4d3d-a3cc-c5b48d391adf" (UID: "77ab71aa-384e-4d3d-a3cc-c5b48d391adf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.618579 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.618637 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxx77\" (UniqueName: \"kubernetes.io/projected/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-kube-api-access-gxx77\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.618652 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.618663 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77ab71aa-384e-4d3d-a3cc-c5b48d391adf-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.834318 4956 generic.go:334] "Generic (PLEG): container finished" podID="77ab71aa-384e-4d3d-a3cc-c5b48d391adf" containerID="8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04" exitCode=0 Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.834397 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.834390 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" event={"ID":"77ab71aa-384e-4d3d-a3cc-c5b48d391adf","Type":"ContainerDied","Data":"8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04"} Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.835331 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757cddf575-wxj5r" event={"ID":"77ab71aa-384e-4d3d-a3cc-c5b48d391adf","Type":"ContainerDied","Data":"c451c29c4c28bb471b684032b69e490ff8ee78582df0444f2f421545a42bd50b"} Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.835367 4956 scope.go:117] "RemoveContainer" containerID="8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04" Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.839452 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="357cd0f449d0c4885ee791783ca31e326eaaa7e5f1a04d708b2435c26c26f499" exitCode=0 Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.839518 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"357cd0f449d0c4885ee791783ca31e326eaaa7e5f1a04d708b2435c26c26f499"} Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.839586 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"f437f59e0bbde583a02d6871ab0b4a73cd7224d37812609e0ec49e3d0eab2998"} Sep 30 05:44:18 crc kubenswrapper[4956]: I0930 05:44:18.872642 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757cddf575-wxj5r"] Sep 30 05:44:18 crc 
kubenswrapper[4956]: I0930 05:44:18.878460 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757cddf575-wxj5r"] Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.415958 4956 scope.go:117] "RemoveContainer" containerID="30cb2f863c1464a207e09afe7952b5fa06a2eb7cc6ebf692fefbf60909aa8ada" Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.477193 4956 scope.go:117] "RemoveContainer" containerID="8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04" Sep 30 05:44:19 crc kubenswrapper[4956]: E0930 05:44:19.478349 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04\": container with ID starting with 8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04 not found: ID does not exist" containerID="8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04" Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.478376 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04"} err="failed to get container status \"8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04\": rpc error: code = NotFound desc = could not find container \"8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04\": container with ID starting with 8877f2f250d1acdf348735416e975d1daa6a2028e55caf3dc47233247d000b04 not found: ID does not exist" Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.478397 4956 scope.go:117] "RemoveContainer" containerID="30cb2f863c1464a207e09afe7952b5fa06a2eb7cc6ebf692fefbf60909aa8ada" Sep 30 05:44:19 crc kubenswrapper[4956]: E0930 05:44:19.478764 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30cb2f863c1464a207e09afe7952b5fa06a2eb7cc6ebf692fefbf60909aa8ada\": 
container with ID starting with 30cb2f863c1464a207e09afe7952b5fa06a2eb7cc6ebf692fefbf60909aa8ada not found: ID does not exist" containerID="30cb2f863c1464a207e09afe7952b5fa06a2eb7cc6ebf692fefbf60909aa8ada" Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.478828 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30cb2f863c1464a207e09afe7952b5fa06a2eb7cc6ebf692fefbf60909aa8ada"} err="failed to get container status \"30cb2f863c1464a207e09afe7952b5fa06a2eb7cc6ebf692fefbf60909aa8ada\": rpc error: code = NotFound desc = could not find container \"30cb2f863c1464a207e09afe7952b5fa06a2eb7cc6ebf692fefbf60909aa8ada\": container with ID starting with 30cb2f863c1464a207e09afe7952b5fa06a2eb7cc6ebf692fefbf60909aa8ada not found: ID does not exist" Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.478857 4956 scope.go:117] "RemoveContainer" containerID="cb7b5906427c53280d2cfba1bc232da8c4f2d2136d0c44bbfff91603d668f7d8" Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.501881 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-qm5wt" Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.634077 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lgpg\" (UniqueName: \"kubernetes.io/projected/5a30fe5e-a789-4251-87bb-bba57356337a-kube-api-access-4lgpg\") pod \"5a30fe5e-a789-4251-87bb-bba57356337a\" (UID: \"5a30fe5e-a789-4251-87bb-bba57356337a\") " Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.640393 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a30fe5e-a789-4251-87bb-bba57356337a-kube-api-access-4lgpg" (OuterVolumeSpecName: "kube-api-access-4lgpg") pod "5a30fe5e-a789-4251-87bb-bba57356337a" (UID: "5a30fe5e-a789-4251-87bb-bba57356337a"). InnerVolumeSpecName "kube-api-access-4lgpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.736819 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lgpg\" (UniqueName: \"kubernetes.io/projected/5a30fe5e-a789-4251-87bb-bba57356337a-kube-api-access-4lgpg\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.851190 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f68c4eb3-451f-460b-996f-3a36ac7da7e2","Type":"ContainerStarted","Data":"3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b"} Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.855796 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-qm5wt" event={"ID":"5a30fe5e-a789-4251-87bb-bba57356337a","Type":"ContainerDied","Data":"91059a9503b5a763af84cc6c3469d2400f0c6f1c45ea96078f72de3853d7bfe8"} Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.855830 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91059a9503b5a763af84cc6c3469d2400f0c6f1c45ea96078f72de3853d7bfe8" Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.855887 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-qm5wt" Sep 30 05:44:19 crc kubenswrapper[4956]: I0930 05:44:19.879875 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.520691921 podStartE2EDuration="53.879858972s" podCreationTimestamp="2025-09-30 05:43:26 +0000 UTC" firstStartedPulling="2025-09-30 05:43:29.119009439 +0000 UTC m=+879.446129964" lastFinishedPulling="2025-09-30 05:44:19.47817649 +0000 UTC m=+929.805297015" observedRunningTime="2025-09-30 05:44:19.874955541 +0000 UTC m=+930.202076066" watchObservedRunningTime="2025-09-30 05:44:19.879858972 +0000 UTC m=+930.206979487" Sep 30 05:44:20 crc kubenswrapper[4956]: I0930 05:44:20.350742 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77ab71aa-384e-4d3d-a3cc-c5b48d391adf" path="/var/lib/kubelet/pods/77ab71aa-384e-4d3d-a3cc-c5b48d391adf/volumes" Sep 30 05:44:20 crc kubenswrapper[4956]: I0930 05:44:20.867965 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"82458356-9089-4c9d-a672-746eb618af3d","Type":"ContainerStarted","Data":"5c23995d240a542574b4ba75cf0ae5dd90f630f725d31027e562b8212fee29c5"} Sep 30 05:44:20 crc kubenswrapper[4956]: I0930 05:44:20.891660 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.076907323 podStartE2EDuration="51.891617699s" podCreationTimestamp="2025-09-30 05:43:29 +0000 UTC" firstStartedPulling="2025-09-30 05:43:41.216316553 +0000 UTC m=+891.543437068" lastFinishedPulling="2025-09-30 05:44:20.031026929 +0000 UTC m=+930.358147444" observedRunningTime="2025-09-30 05:44:20.890780466 +0000 UTC m=+931.217901001" watchObservedRunningTime="2025-09-30 05:44:20.891617699 +0000 UTC m=+931.218738224" Sep 30 05:44:21 crc kubenswrapper[4956]: I0930 05:44:21.255728 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovsdbserver-nb-0" Sep 30 05:44:21 crc kubenswrapper[4956]: I0930 05:44:21.878684 4956 generic.go:334] "Generic (PLEG): container finished" podID="547b221a-e3b1-4a31-b09c-07022356f1e9" containerID="c17858db66d349a9461f55db1edf631ff7efef6c3ab8b226618b8fe700994708" exitCode=0 Sep 30 05:44:21 crc kubenswrapper[4956]: I0930 05:44:21.878985 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2zwhw" event={"ID":"547b221a-e3b1-4a31-b09c-07022356f1e9","Type":"ContainerDied","Data":"c17858db66d349a9461f55db1edf631ff7efef6c3ab8b226618b8fe700994708"} Sep 30 05:44:22 crc kubenswrapper[4956]: I0930 05:44:22.255286 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 30 05:44:22 crc kubenswrapper[4956]: I0930 05:44:22.592810 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.307822 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.403664 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547b221a-e3b1-4a31-b09c-07022356f1e9-scripts\") pod \"547b221a-e3b1-4a31-b09c-07022356f1e9\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.403738 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-dispersionconf\") pod \"547b221a-e3b1-4a31-b09c-07022356f1e9\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.403762 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-swiftconf\") pod \"547b221a-e3b1-4a31-b09c-07022356f1e9\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.403831 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/547b221a-e3b1-4a31-b09c-07022356f1e9-ring-data-devices\") pod \"547b221a-e3b1-4a31-b09c-07022356f1e9\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.403887 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-combined-ca-bundle\") pod \"547b221a-e3b1-4a31-b09c-07022356f1e9\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.405195 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/547b221a-e3b1-4a31-b09c-07022356f1e9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "547b221a-e3b1-4a31-b09c-07022356f1e9" (UID: "547b221a-e3b1-4a31-b09c-07022356f1e9"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.405238 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp96t\" (UniqueName: \"kubernetes.io/projected/547b221a-e3b1-4a31-b09c-07022356f1e9-kube-api-access-qp96t\") pod \"547b221a-e3b1-4a31-b09c-07022356f1e9\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.405288 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/547b221a-e3b1-4a31-b09c-07022356f1e9-etc-swift\") pod \"547b221a-e3b1-4a31-b09c-07022356f1e9\" (UID: \"547b221a-e3b1-4a31-b09c-07022356f1e9\") " Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.405859 4956 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/547b221a-e3b1-4a31-b09c-07022356f1e9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.407841 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/547b221a-e3b1-4a31-b09c-07022356f1e9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "547b221a-e3b1-4a31-b09c-07022356f1e9" (UID: "547b221a-e3b1-4a31-b09c-07022356f1e9"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.409983 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/547b221a-e3b1-4a31-b09c-07022356f1e9-kube-api-access-qp96t" (OuterVolumeSpecName: "kube-api-access-qp96t") pod "547b221a-e3b1-4a31-b09c-07022356f1e9" (UID: "547b221a-e3b1-4a31-b09c-07022356f1e9"). InnerVolumeSpecName "kube-api-access-qp96t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.415628 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "547b221a-e3b1-4a31-b09c-07022356f1e9" (UID: "547b221a-e3b1-4a31-b09c-07022356f1e9"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.424418 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/547b221a-e3b1-4a31-b09c-07022356f1e9-scripts" (OuterVolumeSpecName: "scripts") pod "547b221a-e3b1-4a31-b09c-07022356f1e9" (UID: "547b221a-e3b1-4a31-b09c-07022356f1e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.430775 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "547b221a-e3b1-4a31-b09c-07022356f1e9" (UID: "547b221a-e3b1-4a31-b09c-07022356f1e9"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.435489 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "547b221a-e3b1-4a31-b09c-07022356f1e9" (UID: "547b221a-e3b1-4a31-b09c-07022356f1e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.507441 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.507556 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.507571 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp96t\" (UniqueName: \"kubernetes.io/projected/547b221a-e3b1-4a31-b09c-07022356f1e9-kube-api-access-qp96t\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.507584 4956 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/547b221a-e3b1-4a31-b09c-07022356f1e9-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.507596 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547b221a-e3b1-4a31-b09c-07022356f1e9-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.507605 4956 
reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.507615 4956 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/547b221a-e3b1-4a31-b09c-07022356f1e9-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.517549 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38f02895-66c4-4da2-b408-838646d7ecbd-etc-swift\") pod \"swift-storage-0\" (UID: \"38f02895-66c4-4da2-b408-838646d7ecbd\") " pod="openstack/swift-storage-0" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.685567 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.909438 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2zwhw" event={"ID":"547b221a-e3b1-4a31-b09c-07022356f1e9","Type":"ContainerDied","Data":"f2360b452d90d62c435de75a3be924f4d4c3a0d26670032ec61a9c9e037bfab2"} Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.909486 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2360b452d90d62c435de75a3be924f4d4c3a0d26670032ec61a9c9e037bfab2" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.909508 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2zwhw" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.997176 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-pspqn"] Sep 30 05:44:23 crc kubenswrapper[4956]: E0930 05:44:23.997788 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ab71aa-384e-4d3d-a3cc-c5b48d391adf" containerName="init" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.997817 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ab71aa-384e-4d3d-a3cc-c5b48d391adf" containerName="init" Sep 30 05:44:23 crc kubenswrapper[4956]: E0930 05:44:23.997838 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="547b221a-e3b1-4a31-b09c-07022356f1e9" containerName="swift-ring-rebalance" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.997850 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="547b221a-e3b1-4a31-b09c-07022356f1e9" containerName="swift-ring-rebalance" Sep 30 05:44:23 crc kubenswrapper[4956]: E0930 05:44:23.997878 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a30fe5e-a789-4251-87bb-bba57356337a" containerName="mariadb-database-create" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.997888 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a30fe5e-a789-4251-87bb-bba57356337a" containerName="mariadb-database-create" Sep 30 05:44:23 crc kubenswrapper[4956]: E0930 05:44:23.997911 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ab71aa-384e-4d3d-a3cc-c5b48d391adf" containerName="dnsmasq-dns" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.997921 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ab71aa-384e-4d3d-a3cc-c5b48d391adf" containerName="dnsmasq-dns" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.998253 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ab71aa-384e-4d3d-a3cc-c5b48d391adf" 
containerName="dnsmasq-dns" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.998285 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a30fe5e-a789-4251-87bb-bba57356337a" containerName="mariadb-database-create" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.998310 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="547b221a-e3b1-4a31-b09c-07022356f1e9" containerName="swift-ring-rebalance" Sep 30 05:44:23 crc kubenswrapper[4956]: I0930 05:44:23.998937 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pspqn" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.008254 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pspqn"] Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.089161 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 05:44:24 crc kubenswrapper[4956]: W0930 05:44:24.099406 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38f02895_66c4_4da2_b408_838646d7ecbd.slice/crio-478b01054b555f79d32b50484bc95636278360f1a9938ac1ce3193620468c17a WatchSource:0}: Error finding container 478b01054b555f79d32b50484bc95636278360f1a9938ac1ce3193620468c17a: Status 404 returned error can't find the container with id 478b01054b555f79d32b50484bc95636278360f1a9938ac1ce3193620468c17a Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.117863 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhj7j\" (UniqueName: \"kubernetes.io/projected/d89a6444-ef65-4aeb-ba5f-cce1b00ae461-kube-api-access-hhj7j\") pod \"keystone-db-create-pspqn\" (UID: \"d89a6444-ef65-4aeb-ba5f-cce1b00ae461\") " pod="openstack/keystone-db-create-pspqn" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.219196 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hhj7j\" (UniqueName: \"kubernetes.io/projected/d89a6444-ef65-4aeb-ba5f-cce1b00ae461-kube-api-access-hhj7j\") pod \"keystone-db-create-pspqn\" (UID: \"d89a6444-ef65-4aeb-ba5f-cce1b00ae461\") " pod="openstack/keystone-db-create-pspqn" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.235452 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhj7j\" (UniqueName: \"kubernetes.io/projected/d89a6444-ef65-4aeb-ba5f-cce1b00ae461-kube-api-access-hhj7j\") pod \"keystone-db-create-pspqn\" (UID: \"d89a6444-ef65-4aeb-ba5f-cce1b00ae461\") " pod="openstack/keystone-db-create-pspqn" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.320354 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pspqn" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.447298 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-z6kts"] Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.454011 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-z6kts" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.477584 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-z6kts"] Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.542516 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hl6g\" (UniqueName: \"kubernetes.io/projected/47c03935-f3af-4925-8dcc-d4b6c6906cf0-kube-api-access-9hl6g\") pod \"placement-db-create-z6kts\" (UID: \"47c03935-f3af-4925-8dcc-d4b6c6906cf0\") " pod="openstack/placement-db-create-z6kts" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.644149 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hl6g\" (UniqueName: \"kubernetes.io/projected/47c03935-f3af-4925-8dcc-d4b6c6906cf0-kube-api-access-9hl6g\") pod \"placement-db-create-z6kts\" (UID: \"47c03935-f3af-4925-8dcc-d4b6c6906cf0\") " pod="openstack/placement-db-create-z6kts" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.663299 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hl6g\" (UniqueName: \"kubernetes.io/projected/47c03935-f3af-4925-8dcc-d4b6c6906cf0-kube-api-access-9hl6g\") pod \"placement-db-create-z6kts\" (UID: \"47c03935-f3af-4925-8dcc-d4b6c6906cf0\") " pod="openstack/placement-db-create-z6kts" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.684824 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-pjvh9"] Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.689277 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pjvh9" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.703347 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pjvh9"] Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.775650 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-z6kts" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.847354 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5448l\" (UniqueName: \"kubernetes.io/projected/417d8763-c01b-4a78-aabe-76229bf38f79-kube-api-access-5448l\") pod \"glance-db-create-pjvh9\" (UID: \"417d8763-c01b-4a78-aabe-76229bf38f79\") " pod="openstack/glance-db-create-pjvh9" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.888376 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pspqn"] Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.919734 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"478b01054b555f79d32b50484bc95636278360f1a9938ac1ce3193620468c17a"} Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.948387 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5448l\" (UniqueName: \"kubernetes.io/projected/417d8763-c01b-4a78-aabe-76229bf38f79-kube-api-access-5448l\") pod \"glance-db-create-pjvh9\" (UID: \"417d8763-c01b-4a78-aabe-76229bf38f79\") " pod="openstack/glance-db-create-pjvh9" Sep 30 05:44:24 crc kubenswrapper[4956]: I0930 05:44:24.968437 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5448l\" (UniqueName: \"kubernetes.io/projected/417d8763-c01b-4a78-aabe-76229bf38f79-kube-api-access-5448l\") pod \"glance-db-create-pjvh9\" (UID: 
\"417d8763-c01b-4a78-aabe-76229bf38f79\") " pod="openstack/glance-db-create-pjvh9" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.017085 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pjvh9" Sep 30 05:44:25 crc kubenswrapper[4956]: W0930 05:44:25.160057 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd89a6444_ef65_4aeb_ba5f_cce1b00ae461.slice/crio-ea3583a1ab17d6fd16f3da9928554fb04c45ee26d95fe7eb9c0907a3d95f7d22 WatchSource:0}: Error finding container ea3583a1ab17d6fd16f3da9928554fb04c45ee26d95fe7eb9c0907a3d95f7d22: Status 404 returned error can't find the container with id ea3583a1ab17d6fd16f3da9928554fb04c45ee26d95fe7eb9c0907a3d95f7d22 Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.201345 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-z6kts"] Sep 30 05:44:25 crc kubenswrapper[4956]: W0930 05:44:25.210097 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47c03935_f3af_4925_8dcc_d4b6c6906cf0.slice/crio-bebc39e76058880f4a8067812e96dfec7ded1ebe9ad58063a06e9b1d4a098128 WatchSource:0}: Error finding container bebc39e76058880f4a8067812e96dfec7ded1ebe9ad58063a06e9b1d4a098128: Status 404 returned error can't find the container with id bebc39e76058880f4a8067812e96dfec7ded1ebe9ad58063a06e9b1d4a098128 Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.296155 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.345222 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.484445 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 30 
05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.487910 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.493807 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.494613 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-46c98" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.494615 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.494770 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.495781 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.559611 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0fedcc9-8df5-495f-adb8-a42a2a811c49-scripts\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.559684 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0fedcc9-8df5-495f-adb8-a42a2a811c49-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.559731 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c0fedcc9-8df5-495f-adb8-a42a2a811c49-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.559791 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c0fedcc9-8df5-495f-adb8-a42a2a811c49-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.559814 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0fedcc9-8df5-495f-adb8-a42a2a811c49-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.559835 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0fedcc9-8df5-495f-adb8-a42a2a811c49-config\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.559869 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb4d6\" (UniqueName: \"kubernetes.io/projected/c0fedcc9-8df5-495f-adb8-a42a2a811c49-kube-api-access-pb4d6\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.634006 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pjvh9"] Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.662193 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/c0fedcc9-8df5-495f-adb8-a42a2a811c49-scripts\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.662246 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0fedcc9-8df5-495f-adb8-a42a2a811c49-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.662279 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0fedcc9-8df5-495f-adb8-a42a2a811c49-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.662326 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c0fedcc9-8df5-495f-adb8-a42a2a811c49-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.662341 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0fedcc9-8df5-495f-adb8-a42a2a811c49-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.662356 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0fedcc9-8df5-495f-adb8-a42a2a811c49-config\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 
05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.662380 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb4d6\" (UniqueName: \"kubernetes.io/projected/c0fedcc9-8df5-495f-adb8-a42a2a811c49-kube-api-access-pb4d6\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.663743 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c0fedcc9-8df5-495f-adb8-a42a2a811c49-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.663893 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0fedcc9-8df5-495f-adb8-a42a2a811c49-config\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.664067 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0fedcc9-8df5-495f-adb8-a42a2a811c49-scripts\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.668463 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0fedcc9-8df5-495f-adb8-a42a2a811c49-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.669048 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0fedcc9-8df5-495f-adb8-a42a2a811c49-ovn-northd-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.669421 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0fedcc9-8df5-495f-adb8-a42a2a811c49-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.679560 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb4d6\" (UniqueName: \"kubernetes.io/projected/c0fedcc9-8df5-495f-adb8-a42a2a811c49-kube-api-access-pb4d6\") pod \"ovn-northd-0\" (UID: \"c0fedcc9-8df5-495f-adb8-a42a2a811c49\") " pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.821194 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.931314 4956 generic.go:334] "Generic (PLEG): container finished" podID="47c03935-f3af-4925-8dcc-d4b6c6906cf0" containerID="5e82cb4d8be83def87bd14d177e7a5d18f58cb4811029d7ffe387a8978e1070b" exitCode=0 Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.931389 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-z6kts" event={"ID":"47c03935-f3af-4925-8dcc-d4b6c6906cf0","Type":"ContainerDied","Data":"5e82cb4d8be83def87bd14d177e7a5d18f58cb4811029d7ffe387a8978e1070b"} Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.931415 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-z6kts" event={"ID":"47c03935-f3af-4925-8dcc-d4b6c6906cf0","Type":"ContainerStarted","Data":"bebc39e76058880f4a8067812e96dfec7ded1ebe9ad58063a06e9b1d4a098128"} Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.940407 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"87135fc993be2370e18a836f8291486d28f3372b0294b226ebc3d2e5b748a0be"} Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.940448 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"30391e21cbffc6ea98d9f9fcfebc7854e83214c9245f3b57d0a1b1165b9aff9d"} Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.942927 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pjvh9" event={"ID":"417d8763-c01b-4a78-aabe-76229bf38f79","Type":"ContainerStarted","Data":"a9fea60df5a048553fc067fbaf2628453ccdd923dd95567a529f794d75430890"} Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.942969 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pjvh9" event={"ID":"417d8763-c01b-4a78-aabe-76229bf38f79","Type":"ContainerStarted","Data":"35eb4f95c5e1d722943d0726aa78b1b45bb5a7e3ae1dc41b3f860bfe8fb9f61a"} Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.957681 4956 generic.go:334] "Generic (PLEG): container finished" podID="d89a6444-ef65-4aeb-ba5f-cce1b00ae461" containerID="2057be081c11543283d04c30ea09d2959e6322cd9857be78d2086c6c1800af53" exitCode=0 Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.957919 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pspqn" event={"ID":"d89a6444-ef65-4aeb-ba5f-cce1b00ae461","Type":"ContainerDied","Data":"2057be081c11543283d04c30ea09d2959e6322cd9857be78d2086c6c1800af53"} Sep 30 05:44:25 crc kubenswrapper[4956]: I0930 05:44:25.957994 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pspqn" event={"ID":"d89a6444-ef65-4aeb-ba5f-cce1b00ae461","Type":"ContainerStarted","Data":"ea3583a1ab17d6fd16f3da9928554fb04c45ee26d95fe7eb9c0907a3d95f7d22"} Sep 30 05:44:25 crc 
kubenswrapper[4956]: I0930 05:44:25.969114 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-pjvh9" podStartSLOduration=1.9690935760000001 podStartE2EDuration="1.969093576s" podCreationTimestamp="2025-09-30 05:44:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:44:25.962345722 +0000 UTC m=+936.289466247" watchObservedRunningTime="2025-09-30 05:44:25.969093576 +0000 UTC m=+936.296214101" Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.280725 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 05:44:26 crc kubenswrapper[4956]: W0930 05:44:26.290580 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0fedcc9_8df5_495f_adb8_a42a2a811c49.slice/crio-390b1f3efc4bfd3f94da6d896ed4f254998ceec27d61b2497c0514e3eb018a49 WatchSource:0}: Error finding container 390b1f3efc4bfd3f94da6d896ed4f254998ceec27d61b2497c0514e3eb018a49: Status 404 returned error can't find the container with id 390b1f3efc4bfd3f94da6d896ed4f254998ceec27d61b2497c0514e3eb018a49 Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.478228 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-ac03-account-create-dzz6s"] Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.480339 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-ac03-account-create-dzz6s" Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.483815 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.485872 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-ac03-account-create-dzz6s"] Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.577754 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbfxt\" (UniqueName: \"kubernetes.io/projected/5d3549a7-c08c-4c47-a587-9a9672ef54f7-kube-api-access-vbfxt\") pod \"watcher-ac03-account-create-dzz6s\" (UID: \"5d3549a7-c08c-4c47-a587-9a9672ef54f7\") " pod="openstack/watcher-ac03-account-create-dzz6s" Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.679280 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbfxt\" (UniqueName: \"kubernetes.io/projected/5d3549a7-c08c-4c47-a587-9a9672ef54f7-kube-api-access-vbfxt\") pod \"watcher-ac03-account-create-dzz6s\" (UID: \"5d3549a7-c08c-4c47-a587-9a9672ef54f7\") " pod="openstack/watcher-ac03-account-create-dzz6s" Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.711056 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbfxt\" (UniqueName: \"kubernetes.io/projected/5d3549a7-c08c-4c47-a587-9a9672ef54f7-kube-api-access-vbfxt\") pod \"watcher-ac03-account-create-dzz6s\" (UID: \"5d3549a7-c08c-4c47-a587-9a9672ef54f7\") " pod="openstack/watcher-ac03-account-create-dzz6s" Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.807101 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-ac03-account-create-dzz6s" Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.968741 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c0fedcc9-8df5-495f-adb8-a42a2a811c49","Type":"ContainerStarted","Data":"390b1f3efc4bfd3f94da6d896ed4f254998ceec27d61b2497c0514e3eb018a49"} Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.971221 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"f68b38a2a384aac05fc4dbd812cfe9e2eb96f29e8838cb405b0a158257b8cf0c"} Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.971246 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"1c2ca536e1ef4cd0840909fd6be2017d377f466f211c99a47423c12cdff3072f"} Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.972425 4956 generic.go:334] "Generic (PLEG): container finished" podID="417d8763-c01b-4a78-aabe-76229bf38f79" containerID="a9fea60df5a048553fc067fbaf2628453ccdd923dd95567a529f794d75430890" exitCode=0 Sep 30 05:44:26 crc kubenswrapper[4956]: I0930 05:44:26.972879 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pjvh9" event={"ID":"417d8763-c01b-4a78-aabe-76229bf38f79","Type":"ContainerDied","Data":"a9fea60df5a048553fc067fbaf2628453ccdd923dd95567a529f794d75430890"} Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.504420 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-z6kts" Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.537600 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pspqn" Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.593006 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.599880 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hl6g\" (UniqueName: \"kubernetes.io/projected/47c03935-f3af-4925-8dcc-d4b6c6906cf0-kube-api-access-9hl6g\") pod \"47c03935-f3af-4925-8dcc-d4b6c6906cf0\" (UID: \"47c03935-f3af-4925-8dcc-d4b6c6906cf0\") " Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.599977 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhj7j\" (UniqueName: \"kubernetes.io/projected/d89a6444-ef65-4aeb-ba5f-cce1b00ae461-kube-api-access-hhj7j\") pod \"d89a6444-ef65-4aeb-ba5f-cce1b00ae461\" (UID: \"d89a6444-ef65-4aeb-ba5f-cce1b00ae461\") " Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.604888 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.604993 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47c03935-f3af-4925-8dcc-d4b6c6906cf0-kube-api-access-9hl6g" (OuterVolumeSpecName: "kube-api-access-9hl6g") pod "47c03935-f3af-4925-8dcc-d4b6c6906cf0" (UID: "47c03935-f3af-4925-8dcc-d4b6c6906cf0"). InnerVolumeSpecName "kube-api-access-9hl6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.634226 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d89a6444-ef65-4aeb-ba5f-cce1b00ae461-kube-api-access-hhj7j" (OuterVolumeSpecName: "kube-api-access-hhj7j") pod "d89a6444-ef65-4aeb-ba5f-cce1b00ae461" (UID: "d89a6444-ef65-4aeb-ba5f-cce1b00ae461"). 
InnerVolumeSpecName "kube-api-access-hhj7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.683457 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-ac03-account-create-dzz6s"] Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.703250 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hl6g\" (UniqueName: \"kubernetes.io/projected/47c03935-f3af-4925-8dcc-d4b6c6906cf0-kube-api-access-9hl6g\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.703284 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhj7j\" (UniqueName: \"kubernetes.io/projected/d89a6444-ef65-4aeb-ba5f-cce1b00ae461-kube-api-access-hhj7j\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.993931 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-z6kts" event={"ID":"47c03935-f3af-4925-8dcc-d4b6c6906cf0","Type":"ContainerDied","Data":"bebc39e76058880f4a8067812e96dfec7ded1ebe9ad58063a06e9b1d4a098128"} Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.993971 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bebc39e76058880f4a8067812e96dfec7ded1ebe9ad58063a06e9b1d4a098128" Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.994029 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-z6kts" Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.996048 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c0fedcc9-8df5-495f-adb8-a42a2a811c49","Type":"ContainerStarted","Data":"bb7076b424366b35917ded3edadeb8ac3e102fd2d259db5b6545b64a5485ff82"} Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.996093 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c0fedcc9-8df5-495f-adb8-a42a2a811c49","Type":"ContainerStarted","Data":"66b0d6091ad64f7c2ecb81a99e085bcdbe41202f9f79455124bf30987f09bfe5"} Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.997193 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.999412 4956 generic.go:334] "Generic (PLEG): container finished" podID="5d3549a7-c08c-4c47-a587-9a9672ef54f7" containerID="2301fbabd5b9cc312adc9c880f268328ff09ef0b14a3c14a205a007ec8b3a3b7" exitCode=0 Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.999459 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-ac03-account-create-dzz6s" event={"ID":"5d3549a7-c08c-4c47-a587-9a9672ef54f7","Type":"ContainerDied","Data":"2301fbabd5b9cc312adc9c880f268328ff09ef0b14a3c14a205a007ec8b3a3b7"} Sep 30 05:44:27 crc kubenswrapper[4956]: I0930 05:44:27.999476 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-ac03-account-create-dzz6s" event={"ID":"5d3549a7-c08c-4c47-a587-9a9672ef54f7","Type":"ContainerStarted","Data":"8b1e6482bbd60b8835df502e77c657383158b66a493c952cd44908ad315b9bde"} Sep 30 05:44:28 crc kubenswrapper[4956]: I0930 05:44:28.002974 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"efec162687afd16e6682f3746af5543258d7e6b0267060133dcc4e2d56c041cc"} Sep 30 05:44:28 crc kubenswrapper[4956]: I0930 05:44:28.003000 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"0583a9ba93cca87163681f58ed0d881ff78ea03670729a516afd563f3bcca1ae"} Sep 30 05:44:28 crc kubenswrapper[4956]: I0930 05:44:28.003010 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"a58f0ca49289537a11e5d5a0821c0603c1876058701df3892d71cd1384882c97"} Sep 30 05:44:28 crc kubenswrapper[4956]: I0930 05:44:28.004282 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pspqn" Sep 30 05:44:28 crc kubenswrapper[4956]: I0930 05:44:28.005477 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pspqn" event={"ID":"d89a6444-ef65-4aeb-ba5f-cce1b00ae461","Type":"ContainerDied","Data":"ea3583a1ab17d6fd16f3da9928554fb04c45ee26d95fe7eb9c0907a3d95f7d22"} Sep 30 05:44:28 crc kubenswrapper[4956]: I0930 05:44:28.005519 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea3583a1ab17d6fd16f3da9928554fb04c45ee26d95fe7eb9c0907a3d95f7d22" Sep 30 05:44:28 crc kubenswrapper[4956]: I0930 05:44:28.007673 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:28 crc kubenswrapper[4956]: I0930 05:44:28.028746 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.077462803 podStartE2EDuration="3.028727111s" podCreationTimestamp="2025-09-30 05:44:25 +0000 UTC" firstStartedPulling="2025-09-30 05:44:26.292485977 +0000 UTC 
m=+936.619606502" lastFinishedPulling="2025-09-30 05:44:27.243750285 +0000 UTC m=+937.570870810" observedRunningTime="2025-09-30 05:44:28.020824273 +0000 UTC m=+938.347944818" watchObservedRunningTime="2025-09-30 05:44:28.028727111 +0000 UTC m=+938.355847636" Sep 30 05:44:28 crc kubenswrapper[4956]: I0930 05:44:28.392686 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pjvh9" Sep 30 05:44:28 crc kubenswrapper[4956]: I0930 05:44:28.518690 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5448l\" (UniqueName: \"kubernetes.io/projected/417d8763-c01b-4a78-aabe-76229bf38f79-kube-api-access-5448l\") pod \"417d8763-c01b-4a78-aabe-76229bf38f79\" (UID: \"417d8763-c01b-4a78-aabe-76229bf38f79\") " Sep 30 05:44:28 crc kubenswrapper[4956]: I0930 05:44:28.531318 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/417d8763-c01b-4a78-aabe-76229bf38f79-kube-api-access-5448l" (OuterVolumeSpecName: "kube-api-access-5448l") pod "417d8763-c01b-4a78-aabe-76229bf38f79" (UID: "417d8763-c01b-4a78-aabe-76229bf38f79"). InnerVolumeSpecName "kube-api-access-5448l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:28 crc kubenswrapper[4956]: I0930 05:44:28.621350 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5448l\" (UniqueName: \"kubernetes.io/projected/417d8763-c01b-4a78-aabe-76229bf38f79-kube-api-access-5448l\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:29 crc kubenswrapper[4956]: I0930 05:44:29.011571 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pjvh9" event={"ID":"417d8763-c01b-4a78-aabe-76229bf38f79","Type":"ContainerDied","Data":"35eb4f95c5e1d722943d0726aa78b1b45bb5a7e3ae1dc41b3f860bfe8fb9f61a"} Sep 30 05:44:29 crc kubenswrapper[4956]: I0930 05:44:29.011611 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35eb4f95c5e1d722943d0726aa78b1b45bb5a7e3ae1dc41b3f860bfe8fb9f61a" Sep 30 05:44:29 crc kubenswrapper[4956]: I0930 05:44:29.011664 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pjvh9" Sep 30 05:44:29 crc kubenswrapper[4956]: I0930 05:44:29.019229 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"a1a14fe0fee4174dad5f17f2441e3f4d5f0dcc835d2c8db96bf04f8ecf79d547"} Sep 30 05:44:29 crc kubenswrapper[4956]: I0930 05:44:29.386207 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-ac03-account-create-dzz6s" Sep 30 05:44:29 crc kubenswrapper[4956]: I0930 05:44:29.432865 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbfxt\" (UniqueName: \"kubernetes.io/projected/5d3549a7-c08c-4c47-a587-9a9672ef54f7-kube-api-access-vbfxt\") pod \"5d3549a7-c08c-4c47-a587-9a9672ef54f7\" (UID: \"5d3549a7-c08c-4c47-a587-9a9672ef54f7\") " Sep 30 05:44:29 crc kubenswrapper[4956]: I0930 05:44:29.445387 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3549a7-c08c-4c47-a587-9a9672ef54f7-kube-api-access-vbfxt" (OuterVolumeSpecName: "kube-api-access-vbfxt") pod "5d3549a7-c08c-4c47-a587-9a9672ef54f7" (UID: "5d3549a7-c08c-4c47-a587-9a9672ef54f7"). InnerVolumeSpecName "kube-api-access-vbfxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:29 crc kubenswrapper[4956]: I0930 05:44:29.534465 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbfxt\" (UniqueName: \"kubernetes.io/projected/5d3549a7-c08c-4c47-a587-9a9672ef54f7-kube-api-access-vbfxt\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.030142 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-ac03-account-create-dzz6s" event={"ID":"5d3549a7-c08c-4c47-a587-9a9672ef54f7","Type":"ContainerDied","Data":"8b1e6482bbd60b8835df502e77c657383158b66a493c952cd44908ad315b9bde"} Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.030509 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b1e6482bbd60b8835df502e77c657383158b66a493c952cd44908ad315b9bde" Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.030158 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-ac03-account-create-dzz6s" Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.031550 4956 generic.go:334] "Generic (PLEG): container finished" podID="8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" containerID="1dd46384bd89f5a02cbb07228ce6eb6563ced7e59f1d12f3cc83c531865ff7de" exitCode=0 Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.031605 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4","Type":"ContainerDied","Data":"1dd46384bd89f5a02cbb07228ce6eb6563ced7e59f1d12f3cc83c531865ff7de"} Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.036634 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"45a08a40670c28cc8384561c541ac701ae3789d08fbfe8754cd1eb39a4737142"} Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.036673 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"df56ec816659b785ca2a911ad4b337e76a684aad461f61c84558d4b06760fc5c"} Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.036685 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"1cb3767ce56f35d496861c29ec1636c2f220a250ffd8da06db1b0b7931a655d5"} Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.036694 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"26cd987ab166f542832853cf5338fc99a17c613a2c06aca1214779c25cde904f"} Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.927950 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/prometheus-metric-storage-0"]
Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.929397 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="prometheus" containerID="cri-o://0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8" gracePeriod=600
Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.929417 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="config-reloader" containerID="cri-o://af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180" gracePeriod=600
Sep 30 05:44:30 crc kubenswrapper[4956]: I0930 05:44:30.929492 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="thanos-sidecar" containerID="cri-o://3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b" gracePeriod=600
Sep 30 05:44:31 crc kubenswrapper[4956]: I0930 05:44:31.046110 4956 generic.go:334] "Generic (PLEG): container finished" podID="de3a8c94-71b5-4948-9079-cc7009b9a8ea" containerID="07f92de04e4e412abde86722297c918552f19606b53452424118780ea3a7b636" exitCode=0
Sep 30 05:44:31 crc kubenswrapper[4956]: I0930 05:44:31.046155 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"de3a8c94-71b5-4948-9079-cc7009b9a8ea","Type":"ContainerDied","Data":"07f92de04e4e412abde86722297c918552f19606b53452424118780ea3a7b636"}
Sep 30 05:44:31 crc kubenswrapper[4956]: I0930 05:44:31.048680 4956 generic.go:334] "Generic (PLEG): container finished" podID="0ae54b47-b5ac-43a0-9752-797d2f81ff29" containerID="a061664973516acc797dccde5fe413c411f5ff1d462127643f441eef8a326d25" exitCode=0
Sep 30 05:44:31 crc kubenswrapper[4956]: I0930 05:44:31.048717 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ae54b47-b5ac-43a0-9752-797d2f81ff29","Type":"ContainerDied","Data":"a061664973516acc797dccde5fe413c411f5ff1d462127643f441eef8a326d25"}
Sep 30 05:44:32 crc kubenswrapper[4956]: I0930 05:44:32.593997 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="prometheus" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.106722 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5dab-account-create-crhmx"]
Sep 30 05:44:34 crc kubenswrapper[4956]: E0930 05:44:34.112785 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417d8763-c01b-4a78-aabe-76229bf38f79" containerName="mariadb-database-create"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.112845 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="417d8763-c01b-4a78-aabe-76229bf38f79" containerName="mariadb-database-create"
Sep 30 05:44:34 crc kubenswrapper[4956]: E0930 05:44:34.113471 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c03935-f3af-4925-8dcc-d4b6c6906cf0" containerName="mariadb-database-create"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.113488 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c03935-f3af-4925-8dcc-d4b6c6906cf0" containerName="mariadb-database-create"
Sep 30 05:44:34 crc kubenswrapper[4956]: E0930 05:44:34.113521 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89a6444-ef65-4aeb-ba5f-cce1b00ae461" containerName="mariadb-database-create"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.113530 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89a6444-ef65-4aeb-ba5f-cce1b00ae461" containerName="mariadb-database-create"
Sep 30 05:44:34 crc kubenswrapper[4956]: E0930 05:44:34.113549 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3549a7-c08c-4c47-a587-9a9672ef54f7" containerName="mariadb-account-create"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.113558 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3549a7-c08c-4c47-a587-9a9672ef54f7" containerName="mariadb-account-create"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.115002 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d89a6444-ef65-4aeb-ba5f-cce1b00ae461" containerName="mariadb-database-create"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.115039 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c03935-f3af-4925-8dcc-d4b6c6906cf0" containerName="mariadb-database-create"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.115072 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d3549a7-c08c-4c47-a587-9a9672ef54f7" containerName="mariadb-account-create"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.115100 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="417d8763-c01b-4a78-aabe-76229bf38f79" containerName="mariadb-database-create"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.118366 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5dab-account-create-crhmx"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.121679 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.124305 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5dab-account-create-crhmx"]
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.213479 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbjml\" (UniqueName: \"kubernetes.io/projected/4234d793-0813-4224-8b30-754326f3a4e2-kube-api-access-gbjml\") pod \"keystone-5dab-account-create-crhmx\" (UID: \"4234d793-0813-4224-8b30-754326f3a4e2\") " pod="openstack/keystone-5dab-account-create-crhmx"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.314567 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbjml\" (UniqueName: \"kubernetes.io/projected/4234d793-0813-4224-8b30-754326f3a4e2-kube-api-access-gbjml\") pod \"keystone-5dab-account-create-crhmx\" (UID: \"4234d793-0813-4224-8b30-754326f3a4e2\") " pod="openstack/keystone-5dab-account-create-crhmx"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.335435 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbjml\" (UniqueName: \"kubernetes.io/projected/4234d793-0813-4224-8b30-754326f3a4e2-kube-api-access-gbjml\") pod \"keystone-5dab-account-create-crhmx\" (UID: \"4234d793-0813-4224-8b30-754326f3a4e2\") " pod="openstack/keystone-5dab-account-create-crhmx"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.442020 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5dab-account-create-crhmx"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.558291 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f878-account-create-cs4z9"]
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.559958 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f878-account-create-cs4z9"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.570771 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f878-account-create-cs4z9"]
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.585300 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.623765 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59mqd\" (UniqueName: \"kubernetes.io/projected/b78cea6c-ba5c-4c6f-ace8-b815445520ed-kube-api-access-59mqd\") pod \"placement-f878-account-create-cs4z9\" (UID: \"b78cea6c-ba5c-4c6f-ace8-b815445520ed\") " pod="openstack/placement-f878-account-create-cs4z9"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.725091 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59mqd\" (UniqueName: \"kubernetes.io/projected/b78cea6c-ba5c-4c6f-ace8-b815445520ed-kube-api-access-59mqd\") pod \"placement-f878-account-create-cs4z9\" (UID: \"b78cea6c-ba5c-4c6f-ace8-b815445520ed\") " pod="openstack/placement-f878-account-create-cs4z9"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.739701 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a5a5-account-create-4v8x2"]
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.740950 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a5a5-account-create-4v8x2"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.742849 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.754140 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a5a5-account-create-4v8x2"]
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.759862 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59mqd\" (UniqueName: \"kubernetes.io/projected/b78cea6c-ba5c-4c6f-ace8-b815445520ed-kube-api-access-59mqd\") pod \"placement-f878-account-create-cs4z9\" (UID: \"b78cea6c-ba5c-4c6f-ace8-b815445520ed\") " pod="openstack/placement-f878-account-create-cs4z9"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.826754 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhsng\" (UniqueName: \"kubernetes.io/projected/1d1987b5-707c-4084-a639-b626742fb1a3-kube-api-access-zhsng\") pod \"glance-a5a5-account-create-4v8x2\" (UID: \"1d1987b5-707c-4084-a639-b626742fb1a3\") " pod="openstack/glance-a5a5-account-create-4v8x2"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.927675 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f878-account-create-cs4z9"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.929566 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhsng\" (UniqueName: \"kubernetes.io/projected/1d1987b5-707c-4084-a639-b626742fb1a3-kube-api-access-zhsng\") pod \"glance-a5a5-account-create-4v8x2\" (UID: \"1d1987b5-707c-4084-a639-b626742fb1a3\") " pod="openstack/glance-a5a5-account-create-4v8x2"
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.932282 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5dab-account-create-crhmx"]
Sep 30 05:44:34 crc kubenswrapper[4956]: I0930 05:44:34.952184 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhsng\" (UniqueName: \"kubernetes.io/projected/1d1987b5-707c-4084-a639-b626742fb1a3-kube-api-access-zhsng\") pod \"glance-a5a5-account-create-4v8x2\" (UID: \"1d1987b5-707c-4084-a639-b626742fb1a3\") " pod="openstack/glance-a5a5-account-create-4v8x2"
Sep 30 05:44:35 crc kubenswrapper[4956]: I0930 05:44:35.061636 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a5a5-account-create-4v8x2"
Sep 30 05:44:35 crc kubenswrapper[4956]: I0930 05:44:35.092836 4956 generic.go:334] "Generic (PLEG): container finished" podID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerID="3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b" exitCode=0
Sep 30 05:44:35 crc kubenswrapper[4956]: I0930 05:44:35.092867 4956 generic.go:334] "Generic (PLEG): container finished" podID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerID="af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180" exitCode=0
Sep 30 05:44:35 crc kubenswrapper[4956]: I0930 05:44:35.092937 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f68c4eb3-451f-460b-996f-3a36ac7da7e2","Type":"ContainerDied","Data":"3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b"}
Sep 30 05:44:35 crc kubenswrapper[4956]: I0930 05:44:35.092979 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f68c4eb3-451f-460b-996f-3a36ac7da7e2","Type":"ContainerDied","Data":"af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180"}
Sep 30 05:44:35 crc kubenswrapper[4956]: I0930 05:44:35.100764 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4","Type":"ContainerStarted","Data":"60e10f761fe551d720bc4696359254ed9521fee7c50401375c9354a6c9aabe0a"}
Sep 30 05:44:35 crc kubenswrapper[4956]: I0930 05:44:35.101785 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5dab-account-create-crhmx" event={"ID":"4234d793-0813-4224-8b30-754326f3a4e2","Type":"ContainerStarted","Data":"d333d4858839605ee4868e965e20e72fee1829d97bdb7de7ed3062328a01863a"}
Sep 30 05:44:35 crc kubenswrapper[4956]: I0930 05:44:35.107315 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"c699974d352642824e4f90e59175c8d5d78c9285b23a95b5dec7fef4f691b274"}
Sep 30 05:44:35 crc kubenswrapper[4956]: I0930 05:44:35.383787 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f878-account-create-cs4z9"]
Sep 30 05:44:35 crc kubenswrapper[4956]: I0930 05:44:35.504406 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a5a5-account-create-4v8x2"]
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.077301 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.119290 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ae54b47-b5ac-43a0-9752-797d2f81ff29","Type":"ContainerStarted","Data":"e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc"}
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.120553 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.122749 4956 generic.go:334] "Generic (PLEG): container finished" podID="4234d793-0813-4224-8b30-754326f3a4e2" containerID="5eddfba933b9bac2e24149a63bae92b7951a443550ba8aaf8b3374dd4853caf1" exitCode=0
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.122855 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5dab-account-create-crhmx" event={"ID":"4234d793-0813-4224-8b30-754326f3a4e2","Type":"ContainerDied","Data":"5eddfba933b9bac2e24149a63bae92b7951a443550ba8aaf8b3374dd4853caf1"}
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.134800 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"4f1e7bc90097f8a2e66590cf04da5e59b535dac38c1b5ccf64028dbdf204a93f"}
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.134864 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"38f02895-66c4-4da2-b408-838646d7ecbd","Type":"ContainerStarted","Data":"44ec83c8e2f5c7a050f7eb1b3ff68110e6debf08ba488cc8b323aa380567eb1e"}
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.137066 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"de3a8c94-71b5-4948-9079-cc7009b9a8ea","Type":"ContainerStarted","Data":"64bdefbf24bdb77a075396d920febfd909d950eaec3308549ba3620b38593468"}
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.138403 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.141065 4956 generic.go:334] "Generic (PLEG): container finished" podID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerID="0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8" exitCode=0
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.141170 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f68c4eb3-451f-460b-996f-3a36ac7da7e2","Type":"ContainerDied","Data":"0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8"}
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.141181 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.141197 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f68c4eb3-451f-460b-996f-3a36ac7da7e2","Type":"ContainerDied","Data":"15fb3583c205c5af8fc5383f2d4602dbaef094800684712599f1b72cc950bb55"}
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.141224 4956 scope.go:117] "RemoveContainer" containerID="3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.155393 4956 generic.go:334] "Generic (PLEG): container finished" podID="1d1987b5-707c-4084-a639-b626742fb1a3" containerID="b3507a67deaa2910720a6fbbca070c38ed9a2ec79e7100f4aa6a081510852367" exitCode=0
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.155550 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a5a5-account-create-4v8x2" event={"ID":"1d1987b5-707c-4084-a639-b626742fb1a3","Type":"ContainerDied","Data":"b3507a67deaa2910720a6fbbca070c38ed9a2ec79e7100f4aa6a081510852367"}
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.155592 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a5a5-account-create-4v8x2" event={"ID":"1d1987b5-707c-4084-a639-b626742fb1a3","Type":"ContainerStarted","Data":"808b1089b59ec11cc555be2aa0642850de61e9b113adbb03d0592b3196f2fce5"}
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.170563 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371958.684244 podStartE2EDuration="1m18.170531559s" podCreationTimestamp="2025-09-30 05:43:18 +0000 UTC" firstStartedPulling="2025-09-30 05:43:21.125978523 +0000 UTC m=+871.453099048" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:44:36.16994182 +0000 UTC m=+946.497062345" watchObservedRunningTime="2025-09-30 05:44:36.170531559 +0000 UTC m=+946.497652084"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.172861 4956 generic.go:334] "Generic (PLEG): container finished" podID="b78cea6c-ba5c-4c6f-ace8-b815445520ed" containerID="aa9296ad7dbe9f24f36a580c8f290af620b406ededd5aa936b8787092bf8da70" exitCode=0
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.174024 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f878-account-create-cs4z9" event={"ID":"b78cea6c-ba5c-4c6f-ace8-b815445520ed","Type":"ContainerDied","Data":"aa9296ad7dbe9f24f36a580c8f290af620b406ededd5aa936b8787092bf8da70"}
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.174065 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f878-account-create-cs4z9" event={"ID":"b78cea6c-ba5c-4c6f-ace8-b815445520ed","Type":"ContainerStarted","Data":"bf6df38edfa0b1723cfd304db605819b85229a24fd1def9897e6feca51233c46"}
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.174087 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.190971 4956 scope.go:117] "RemoveContainer" containerID="af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.216032 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=-9223371959.638775 podStartE2EDuration="1m17.216001558s" podCreationTimestamp="2025-09-30 05:43:19 +0000 UTC" firstStartedPulling="2025-09-30 05:43:21.393472295 +0000 UTC m=+871.720592820" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:44:36.205456614 +0000 UTC m=+946.532577149" watchObservedRunningTime="2025-09-30 05:44:36.216001558 +0000 UTC m=+946.543122093"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.241318 4956 scope.go:117] "RemoveContainer" containerID="0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.257803 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-thanos-prometheus-http-client-file\") pod \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") "
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.257915 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f68c4eb3-451f-460b-996f-3a36ac7da7e2-tls-assets\") pod \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") "
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.258012 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-web-config\") pod \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") "
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.259172 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") "
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.259212 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f68c4eb3-451f-460b-996f-3a36ac7da7e2-prometheus-metric-storage-rulefiles-0\") pod \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") "
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.259265 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btpq5\" (UniqueName: \"kubernetes.io/projected/f68c4eb3-451f-460b-996f-3a36ac7da7e2-kube-api-access-btpq5\") pod \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") "
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.259313 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f68c4eb3-451f-460b-996f-3a36ac7da7e2-config-out\") pod \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") "
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.259380 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-config\") pod \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\" (UID: \"f68c4eb3-451f-460b-996f-3a36ac7da7e2\") "
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.261375 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f68c4eb3-451f-460b-996f-3a36ac7da7e2-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "f68c4eb3-451f-460b-996f-3a36ac7da7e2" (UID: "f68c4eb3-451f-460b-996f-3a36ac7da7e2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.261991 4956 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f68c4eb3-451f-460b-996f-3a36ac7da7e2-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.267355 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f68c4eb3-451f-460b-996f-3a36ac7da7e2" (UID: "f68c4eb3-451f-460b-996f-3a36ac7da7e2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.267526 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-config" (OuterVolumeSpecName: "config") pod "f68c4eb3-451f-460b-996f-3a36ac7da7e2" (UID: "f68c4eb3-451f-460b-996f-3a36ac7da7e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.270993 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f68c4eb3-451f-460b-996f-3a36ac7da7e2-kube-api-access-btpq5" (OuterVolumeSpecName: "kube-api-access-btpq5") pod "f68c4eb3-451f-460b-996f-3a36ac7da7e2" (UID: "f68c4eb3-451f-460b-996f-3a36ac7da7e2"). InnerVolumeSpecName "kube-api-access-btpq5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.272629 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f68c4eb3-451f-460b-996f-3a36ac7da7e2-config-out" (OuterVolumeSpecName: "config-out") pod "f68c4eb3-451f-460b-996f-3a36ac7da7e2" (UID: "f68c4eb3-451f-460b-996f-3a36ac7da7e2"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.278905 4956 scope.go:117] "RemoveContainer" containerID="d77eb0cfddb1f45cbacce87234ca631853001469f8ce60c4a37151e6a621f3bf"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.284450 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f68c4eb3-451f-460b-996f-3a36ac7da7e2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f68c4eb3-451f-460b-996f-3a36ac7da7e2" (UID: "f68c4eb3-451f-460b-996f-3a36ac7da7e2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.300212 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=25.537608761 podStartE2EDuration="30.300168301s" podCreationTimestamp="2025-09-30 05:44:06 +0000 UTC" firstStartedPulling="2025-09-30 05:44:24.103105361 +0000 UTC m=+934.430225886" lastFinishedPulling="2025-09-30 05:44:28.865664901 +0000 UTC m=+939.192785426" observedRunningTime="2025-09-30 05:44:36.28621902 +0000 UTC m=+946.613339565" watchObservedRunningTime="2025-09-30 05:44:36.300168301 +0000 UTC m=+946.627288846"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.325396 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-web-config" (OuterVolumeSpecName: "web-config") pod "f68c4eb3-451f-460b-996f-3a36ac7da7e2" (UID: "f68c4eb3-451f-460b-996f-3a36ac7da7e2"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.333710 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.942506137 podStartE2EDuration="1m17.333682272s" podCreationTimestamp="2025-09-30 05:43:19 +0000 UTC" firstStartedPulling="2025-09-30 05:43:21.170505363 +0000 UTC m=+871.497625888" lastFinishedPulling="2025-09-30 05:43:58.561681498 +0000 UTC m=+908.888802023" observedRunningTime="2025-09-30 05:44:36.313850754 +0000 UTC m=+946.640971289" watchObservedRunningTime="2025-09-30 05:44:36.333682272 +0000 UTC m=+946.660802827"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.337864 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "f68c4eb3-451f-460b-996f-3a36ac7da7e2" (UID: "f68c4eb3-451f-460b-996f-3a36ac7da7e2"). InnerVolumeSpecName "pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423". PluginName "kubernetes.io/csi", VolumeGidValue ""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.353690 4956 scope.go:117] "RemoveContainer" containerID="3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b"
Sep 30 05:44:36 crc kubenswrapper[4956]: E0930 05:44:36.362543 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b\": container with ID starting with 3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b not found: ID does not exist" containerID="3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.362904 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b"} err="failed to get container status \"3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b\": rpc error: code = NotFound desc = could not find container \"3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b\": container with ID starting with 3008b60fe302d9206264ccd655c76e2d078c6116518f78f93a0ae6db6fb9e23b not found: ID does not exist"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.365098 4956 scope.go:117] "RemoveContainer" containerID="af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.366852 4956 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.366916 4956 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f68c4eb3-451f-460b-996f-3a36ac7da7e2-tls-assets\") on node \"crc\" DevicePath \"\""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.366962 4956 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-web-config\") on node \"crc\" DevicePath \"\""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.367011 4956 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") on node \"crc\" "
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.367037 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btpq5\" (UniqueName: \"kubernetes.io/projected/f68c4eb3-451f-460b-996f-3a36ac7da7e2-kube-api-access-btpq5\") on node \"crc\" DevicePath \"\""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.367052 4956 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f68c4eb3-451f-460b-996f-3a36ac7da7e2-config-out\") on node \"crc\" DevicePath \"\""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.367064 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f68c4eb3-451f-460b-996f-3a36ac7da7e2-config\") on node \"crc\" DevicePath \"\""
Sep 30 05:44:36 crc kubenswrapper[4956]: E0930 05:44:36.368494 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180\": container with ID starting with af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180 not found: ID does not exist" containerID="af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.368551 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180"} err="failed to get container status \"af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180\": rpc error: code = NotFound desc = could not find container \"af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180\": container with ID starting with af3a3da1e7e91db12d9ba7a3d89b5b7f5339bbdc4e037b8e667c2ff825916180 not found: ID does not exist"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.368595 4956 scope.go:117] "RemoveContainer" containerID="0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8"
Sep 30 05:44:36 crc kubenswrapper[4956]: E0930 05:44:36.369066 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8\": container with ID starting with 0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8 not found: ID does not exist" containerID="0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.369091 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8"} err="failed to get container status \"0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8\": rpc error: code = NotFound desc = could not find container \"0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8\": container with ID starting with 0ac804664dfe12f5cb64a3b892120c5c7d5841c7e274fdfe82b115767fb08ff8 not found: ID does not exist"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.369107 4956 scope.go:117] "RemoveContainer" containerID="d77eb0cfddb1f45cbacce87234ca631853001469f8ce60c4a37151e6a621f3bf"
Sep 30 05:44:36 crc kubenswrapper[4956]: E0930 05:44:36.377015 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d77eb0cfddb1f45cbacce87234ca631853001469f8ce60c4a37151e6a621f3bf\": container with ID starting with d77eb0cfddb1f45cbacce87234ca631853001469f8ce60c4a37151e6a621f3bf not found: ID does not exist" containerID="d77eb0cfddb1f45cbacce87234ca631853001469f8ce60c4a37151e6a621f3bf"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.377086 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d77eb0cfddb1f45cbacce87234ca631853001469f8ce60c4a37151e6a621f3bf"} err="failed to get container status \"d77eb0cfddb1f45cbacce87234ca631853001469f8ce60c4a37151e6a621f3bf\": rpc error: code = NotFound desc = could not find container \"d77eb0cfddb1f45cbacce87234ca631853001469f8ce60c4a37151e6a621f3bf\": container with ID starting with d77eb0cfddb1f45cbacce87234ca631853001469f8ce60c4a37151e6a621f3bf not found: ID does not exist"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.425828 4956 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.425984 4956 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423") on node "crc"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.470467 4956 reconciler_common.go:293] "Volume detached for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") on node \"crc\" DevicePath \"\""
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.471268 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.477800 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.506499 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Sep 30 05:44:36 crc kubenswrapper[4956]: E0930 05:44:36.508146 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="config-reloader"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.508171 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="config-reloader"
Sep 30 05:44:36 crc kubenswrapper[4956]: E0930 05:44:36.508196 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="thanos-sidecar"
Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.508208 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="thanos-sidecar"
Sep 30 05:44:36 crc kubenswrapper[4956]: E0930 05:44:36.508217 4956 cpu_manager.go:410] "RemoveStaleState: removing
container" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="init-config-reloader" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.508224 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="init-config-reloader" Sep 30 05:44:36 crc kubenswrapper[4956]: E0930 05:44:36.508267 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="prometheus" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.508275 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="prometheus" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.508440 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="prometheus" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.508454 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="thanos-sidecar" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.508467 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" containerName="config-reloader" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.510289 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.513473 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-wmk85" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.513784 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.513957 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.514706 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.514971 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.516553 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.521654 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.540698 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.617402 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-hwpd5"] Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.621550 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.628734 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.632320 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-hwpd5"] Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.673620 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.673703 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de710cda-e9a2-426f-a617-2ac08ef16386-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.673751 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/de710cda-e9a2-426f-a617-2ac08ef16386-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.673785 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.673804 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de710cda-e9a2-426f-a617-2ac08ef16386-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.673831 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-config\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.673867 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.673895 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2jh8\" (UniqueName: \"kubernetes.io/projected/de710cda-e9a2-426f-a617-2ac08ef16386-kube-api-access-f2jh8\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.673939 
4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.674011 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.674062 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.776593 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-ovsdbserver-nb\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.776710 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 
crc kubenswrapper[4956]: I0930 05:44:36.776751 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-ovsdbserver-sb\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.776787 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-config\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.776827 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.776930 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de710cda-e9a2-426f-a617-2ac08ef16386-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.777020 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-svc\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " 
pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.777079 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8tbw\" (UniqueName: \"kubernetes.io/projected/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-kube-api-access-d8tbw\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.777249 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/de710cda-e9a2-426f-a617-2ac08ef16386-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.777290 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.777312 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de710cda-e9a2-426f-a617-2ac08ef16386-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.777336 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.777400 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.777436 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2jh8\" (UniqueName: \"kubernetes.io/projected/de710cda-e9a2-426f-a617-2ac08ef16386-kube-api-access-f2jh8\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.777541 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.777669 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-swift-storage-0\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.777758 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.778449 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/de710cda-e9a2-426f-a617-2ac08ef16386-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.782654 4956 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.782694 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dd69768781dcae5046075cd89d0649334b1721ac4e99f4c4ea45de29b58bc7b6/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.783568 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.783604 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-config\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.783783 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de710cda-e9a2-426f-a617-2ac08ef16386-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.784003 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.784328 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.784781 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de710cda-e9a2-426f-a617-2ac08ef16386-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.784811 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.790169 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.800547 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2jh8\" (UniqueName: \"kubernetes.io/projected/de710cda-e9a2-426f-a617-2ac08ef16386-kube-api-access-f2jh8\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.812833 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"prometheus-metric-storage-0\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.831996 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.880170 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-swift-storage-0\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.880707 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-ovsdbserver-nb\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.880748 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-ovsdbserver-sb\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.880794 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-config\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.880827 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-svc\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 
30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.880880 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8tbw\" (UniqueName: \"kubernetes.io/projected/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-kube-api-access-d8tbw\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.882583 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-swift-storage-0\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.884102 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-ovsdbserver-nb\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.884942 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-svc\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.885143 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-config\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.895718 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-ovsdbserver-sb\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.916698 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8tbw\" (UniqueName: \"kubernetes.io/projected/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-kube-api-access-d8tbw\") pod \"dnsmasq-dns-55b99bf79c-hwpd5\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:36 crc kubenswrapper[4956]: I0930 05:44:36.936817 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.353663 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 05:44:37 crc kubenswrapper[4956]: W0930 05:44:37.361520 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde710cda_e9a2_426f_a617_2ac08ef16386.slice/crio-0d4bf75f5af38c98c5535b6e021653db444a593f86649a701a155705f2767eba WatchSource:0}: Error finding container 0d4bf75f5af38c98c5535b6e021653db444a593f86649a701a155705f2767eba: Status 404 returned error can't find the container with id 0d4bf75f5af38c98c5535b6e021653db444a593f86649a701a155705f2767eba Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.460679 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-hwpd5"] Sep 30 05:44:37 crc kubenswrapper[4956]: W0930 05:44:37.507359 4956 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeca6601c_9fe9_4e8e_9a47_2d68b02b06e1.slice/crio-a2b951397f44b0647db257f43474cdca5bf403fb26a963e3fb6ca202705a2c27 WatchSource:0}: Error finding container a2b951397f44b0647db257f43474cdca5bf403fb26a963e3fb6ca202705a2c27: Status 404 returned error can't find the container with id a2b951397f44b0647db257f43474cdca5bf403fb26a963e3fb6ca202705a2c27 Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.613489 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5dab-account-create-crhmx" Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.675577 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a5a5-account-create-4v8x2" Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.694836 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbjml\" (UniqueName: \"kubernetes.io/projected/4234d793-0813-4224-8b30-754326f3a4e2-kube-api-access-gbjml\") pod \"4234d793-0813-4224-8b30-754326f3a4e2\" (UID: \"4234d793-0813-4224-8b30-754326f3a4e2\") " Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.699335 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4234d793-0813-4224-8b30-754326f3a4e2-kube-api-access-gbjml" (OuterVolumeSpecName: "kube-api-access-gbjml") pod "4234d793-0813-4224-8b30-754326f3a4e2" (UID: "4234d793-0813-4224-8b30-754326f3a4e2"). InnerVolumeSpecName "kube-api-access-gbjml". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.703775 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f878-account-create-cs4z9" Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.797141 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59mqd\" (UniqueName: \"kubernetes.io/projected/b78cea6c-ba5c-4c6f-ace8-b815445520ed-kube-api-access-59mqd\") pod \"b78cea6c-ba5c-4c6f-ace8-b815445520ed\" (UID: \"b78cea6c-ba5c-4c6f-ace8-b815445520ed\") " Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.797260 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhsng\" (UniqueName: \"kubernetes.io/projected/1d1987b5-707c-4084-a639-b626742fb1a3-kube-api-access-zhsng\") pod \"1d1987b5-707c-4084-a639-b626742fb1a3\" (UID: \"1d1987b5-707c-4084-a639-b626742fb1a3\") " Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.797726 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbjml\" (UniqueName: \"kubernetes.io/projected/4234d793-0813-4224-8b30-754326f3a4e2-kube-api-access-gbjml\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.800454 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d1987b5-707c-4084-a639-b626742fb1a3-kube-api-access-zhsng" (OuterVolumeSpecName: "kube-api-access-zhsng") pod "1d1987b5-707c-4084-a639-b626742fb1a3" (UID: "1d1987b5-707c-4084-a639-b626742fb1a3"). InnerVolumeSpecName "kube-api-access-zhsng". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.801790 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b78cea6c-ba5c-4c6f-ace8-b815445520ed-kube-api-access-59mqd" (OuterVolumeSpecName: "kube-api-access-59mqd") pod "b78cea6c-ba5c-4c6f-ace8-b815445520ed" (UID: "b78cea6c-ba5c-4c6f-ace8-b815445520ed"). InnerVolumeSpecName "kube-api-access-59mqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.899512 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59mqd\" (UniqueName: \"kubernetes.io/projected/b78cea6c-ba5c-4c6f-ace8-b815445520ed-kube-api-access-59mqd\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:37 crc kubenswrapper[4956]: I0930 05:44:37.899545 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhsng\" (UniqueName: \"kubernetes.io/projected/1d1987b5-707c-4084-a639-b626742fb1a3-kube-api-access-zhsng\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.190413 4956 generic.go:334] "Generic (PLEG): container finished" podID="eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" containerID="dcfcd45f7e03ea0e088ebb12d6de6cdba3c6ceeb13717333758e38d327cab22a" exitCode=0 Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.190493 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" event={"ID":"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1","Type":"ContainerDied","Data":"dcfcd45f7e03ea0e088ebb12d6de6cdba3c6ceeb13717333758e38d327cab22a"} Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.190520 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" event={"ID":"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1","Type":"ContainerStarted","Data":"a2b951397f44b0647db257f43474cdca5bf403fb26a963e3fb6ca202705a2c27"} Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.192818 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a5a5-account-create-4v8x2" Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.192834 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a5a5-account-create-4v8x2" event={"ID":"1d1987b5-707c-4084-a639-b626742fb1a3","Type":"ContainerDied","Data":"808b1089b59ec11cc555be2aa0642850de61e9b113adbb03d0592b3196f2fce5"} Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.192856 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="808b1089b59ec11cc555be2aa0642850de61e9b113adbb03d0592b3196f2fce5" Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.194683 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f878-account-create-cs4z9" Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.194707 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f878-account-create-cs4z9" event={"ID":"b78cea6c-ba5c-4c6f-ace8-b815445520ed","Type":"ContainerDied","Data":"bf6df38edfa0b1723cfd304db605819b85229a24fd1def9897e6feca51233c46"} Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.194783 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf6df38edfa0b1723cfd304db605819b85229a24fd1def9897e6feca51233c46" Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.196223 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5dab-account-create-crhmx" event={"ID":"4234d793-0813-4224-8b30-754326f3a4e2","Type":"ContainerDied","Data":"d333d4858839605ee4868e965e20e72fee1829d97bdb7de7ed3062328a01863a"} Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.196250 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d333d4858839605ee4868e965e20e72fee1829d97bdb7de7ed3062328a01863a" Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.196315 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5dab-account-create-crhmx" Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.199331 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"de710cda-e9a2-426f-a617-2ac08ef16386","Type":"ContainerStarted","Data":"0d4bf75f5af38c98c5535b6e021653db444a593f86649a701a155705f2767eba"} Sep 30 05:44:38 crc kubenswrapper[4956]: I0930 05:44:38.420084 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f68c4eb3-451f-460b-996f-3a36ac7da7e2" path="/var/lib/kubelet/pods/f68c4eb3-451f-460b-996f-3a36ac7da7e2/volumes" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.208573 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" event={"ID":"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1","Type":"ContainerStarted","Data":"02a32a5345e4055e251f2625d2879b5c43b7bad65df838c3cf1eb9a3f53a1c82"} Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.208923 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.231405 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" podStartSLOduration=3.231388117 podStartE2EDuration="3.231388117s" podCreationTimestamp="2025-09-30 05:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:44:39.230805348 +0000 UTC m=+949.557925883" watchObservedRunningTime="2025-09-30 05:44:39.231388117 +0000 UTC m=+949.558508642" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.948224 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-6sh8d"] Sep 30 05:44:39 crc kubenswrapper[4956]: E0930 05:44:39.948660 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4234d793-0813-4224-8b30-754326f3a4e2" containerName="mariadb-account-create" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.948677 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4234d793-0813-4224-8b30-754326f3a4e2" containerName="mariadb-account-create" Sep 30 05:44:39 crc kubenswrapper[4956]: E0930 05:44:39.948702 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d1987b5-707c-4084-a639-b626742fb1a3" containerName="mariadb-account-create" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.948711 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d1987b5-707c-4084-a639-b626742fb1a3" containerName="mariadb-account-create" Sep 30 05:44:39 crc kubenswrapper[4956]: E0930 05:44:39.948726 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b78cea6c-ba5c-4c6f-ace8-b815445520ed" containerName="mariadb-account-create" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.948733 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78cea6c-ba5c-4c6f-ace8-b815445520ed" containerName="mariadb-account-create" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.948918 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4234d793-0813-4224-8b30-754326f3a4e2" containerName="mariadb-account-create" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.948937 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b78cea6c-ba5c-4c6f-ace8-b815445520ed" containerName="mariadb-account-create" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.948951 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d1987b5-707c-4084-a639-b626742fb1a3" containerName="mariadb-account-create" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.949665 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.952226 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mwhll" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.952391 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 05:44:39 crc kubenswrapper[4956]: I0930 05:44:39.961744 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6sh8d"] Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.034002 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-combined-ca-bundle\") pod \"glance-db-sync-6sh8d\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.034072 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-config-data\") pod \"glance-db-sync-6sh8d\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.034139 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-db-sync-config-data\") pod \"glance-db-sync-6sh8d\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.034257 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8drsm\" (UniqueName: 
\"kubernetes.io/projected/0ff0cad1-ad65-4838-b4d7-b43974fbe477-kube-api-access-8drsm\") pod \"glance-db-sync-6sh8d\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.135822 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-combined-ca-bundle\") pod \"glance-db-sync-6sh8d\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.135891 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-config-data\") pod \"glance-db-sync-6sh8d\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.135936 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-db-sync-config-data\") pod \"glance-db-sync-6sh8d\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.136012 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8drsm\" (UniqueName: \"kubernetes.io/projected/0ff0cad1-ad65-4838-b4d7-b43974fbe477-kube-api-access-8drsm\") pod \"glance-db-sync-6sh8d\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.140888 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-db-sync-config-data\") pod \"glance-db-sync-6sh8d\" 
(UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.140961 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-combined-ca-bundle\") pod \"glance-db-sync-6sh8d\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.147465 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-config-data\") pod \"glance-db-sync-6sh8d\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.154629 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8drsm\" (UniqueName: \"kubernetes.io/projected/0ff0cad1-ad65-4838-b4d7-b43974fbe477-kube-api-access-8drsm\") pod \"glance-db-sync-6sh8d\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.217831 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"de710cda-e9a2-426f-a617-2ac08ef16386","Type":"ContainerStarted","Data":"7d209536397595a3a62721c93fdf9641064a7710faa4ab0c47cafd7144855ca9"} Sep 30 05:44:40 crc kubenswrapper[4956]: I0930 05:44:40.301860 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6sh8d" Sep 30 05:44:41 crc kubenswrapper[4956]: I0930 05:44:40.809411 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6sh8d"] Sep 30 05:44:41 crc kubenswrapper[4956]: I0930 05:44:40.887292 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 05:44:41 crc kubenswrapper[4956]: I0930 05:44:41.227594 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6sh8d" event={"ID":"0ff0cad1-ad65-4838-b4d7-b43974fbe477","Type":"ContainerStarted","Data":"9f7199404b01e95d8e91a7f6c1b560d49a7d63dd60580fe78a38f811dbe8401f"} Sep 30 05:44:44 crc kubenswrapper[4956]: I0930 05:44:44.693465 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-r7wfs" podUID="611676cd-11d2-44c4-bae2-41b6b22f898d" containerName="ovn-controller" probeResult="failure" output=< Sep 30 05:44:44 crc kubenswrapper[4956]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 05:44:44 crc kubenswrapper[4956]: > Sep 30 05:44:44 crc kubenswrapper[4956]: I0930 05:44:44.738775 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:44:44 crc kubenswrapper[4956]: I0930 05:44:44.750286 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-p9lvd" Sep 30 05:44:44 crc kubenswrapper[4956]: I0930 05:44:44.967861 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r7wfs-config-2lkgj"] Sep 30 05:44:44 crc kubenswrapper[4956]: I0930 05:44:44.969506 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:44 crc kubenswrapper[4956]: I0930 05:44:44.973453 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 05:44:44 crc kubenswrapper[4956]: I0930 05:44:44.984405 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r7wfs-config-2lkgj"] Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.012490 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-log-ovn\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.012540 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfp97\" (UniqueName: \"kubernetes.io/projected/519da563-0852-472a-ba3c-ccf1e6acea16-kube-api-access-wfp97\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.012561 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-run\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.012590 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-run-ovn\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: 
\"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.012636 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/519da563-0852-472a-ba3c-ccf1e6acea16-scripts\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.012653 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/519da563-0852-472a-ba3c-ccf1e6acea16-additional-scripts\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.114337 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/519da563-0852-472a-ba3c-ccf1e6acea16-scripts\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.114383 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/519da563-0852-472a-ba3c-ccf1e6acea16-additional-scripts\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.114460 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-log-ovn\") pod 
\"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.114480 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfp97\" (UniqueName: \"kubernetes.io/projected/519da563-0852-472a-ba3c-ccf1e6acea16-kube-api-access-wfp97\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.114505 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-run\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.114544 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-run-ovn\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.114809 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-run-ovn\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.114814 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-run\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: 
\"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.114877 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-log-ovn\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.140007 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/519da563-0852-472a-ba3c-ccf1e6acea16-additional-scripts\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.141277 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/519da563-0852-472a-ba3c-ccf1e6acea16-scripts\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.153941 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfp97\" (UniqueName: \"kubernetes.io/projected/519da563-0852-472a-ba3c-ccf1e6acea16-kube-api-access-wfp97\") pod \"ovn-controller-r7wfs-config-2lkgj\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.310288 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:45 crc kubenswrapper[4956]: I0930 05:44:45.779703 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r7wfs-config-2lkgj"] Sep 30 05:44:46 crc kubenswrapper[4956]: I0930 05:44:46.264432 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r7wfs-config-2lkgj" event={"ID":"519da563-0852-472a-ba3c-ccf1e6acea16","Type":"ContainerStarted","Data":"509e3037099b2a742bd50bc91ebd9ee374dec1310ab499be8d8ba222635ab076"} Sep 30 05:44:46 crc kubenswrapper[4956]: I0930 05:44:46.264675 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r7wfs-config-2lkgj" event={"ID":"519da563-0852-472a-ba3c-ccf1e6acea16","Type":"ContainerStarted","Data":"7356367315ed34ef74c5a2b492365ab6e881c0516c508d3e1879b1e126472116"} Sep 30 05:44:46 crc kubenswrapper[4956]: I0930 05:44:46.292032 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-r7wfs-config-2lkgj" podStartSLOduration=2.292011968 podStartE2EDuration="2.292011968s" podCreationTimestamp="2025-09-30 05:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:44:46.286827854 +0000 UTC m=+956.613948379" watchObservedRunningTime="2025-09-30 05:44:46.292011968 +0000 UTC m=+956.619132493" Sep 30 05:44:46 crc kubenswrapper[4956]: I0930 05:44:46.939271 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:44:46 crc kubenswrapper[4956]: I0930 05:44:46.990651 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-f688r"] Sep 30 05:44:46 crc kubenswrapper[4956]: I0930 05:44:46.990900 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" 
podUID="44991506-83b3-4752-8cc3-405ab0347a3f" containerName="dnsmasq-dns" containerID="cri-o://522b4001a1ee49e91d4c452cee6d7d6d57c6eb90ee9796861627746a53a04e7d" gracePeriod=10 Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.301220 4956 generic.go:334] "Generic (PLEG): container finished" podID="44991506-83b3-4752-8cc3-405ab0347a3f" containerID="522b4001a1ee49e91d4c452cee6d7d6d57c6eb90ee9796861627746a53a04e7d" exitCode=0 Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.301829 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" event={"ID":"44991506-83b3-4752-8cc3-405ab0347a3f","Type":"ContainerDied","Data":"522b4001a1ee49e91d4c452cee6d7d6d57c6eb90ee9796861627746a53a04e7d"} Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.305441 4956 generic.go:334] "Generic (PLEG): container finished" podID="519da563-0852-472a-ba3c-ccf1e6acea16" containerID="509e3037099b2a742bd50bc91ebd9ee374dec1310ab499be8d8ba222635ab076" exitCode=0 Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.305531 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r7wfs-config-2lkgj" event={"ID":"519da563-0852-472a-ba3c-ccf1e6acea16","Type":"ContainerDied","Data":"509e3037099b2a742bd50bc91ebd9ee374dec1310ab499be8d8ba222635ab076"} Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.309554 4956 generic.go:334] "Generic (PLEG): container finished" podID="de710cda-e9a2-426f-a617-2ac08ef16386" containerID="7d209536397595a3a62721c93fdf9641064a7710faa4ab0c47cafd7144855ca9" exitCode=0 Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.309616 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"de710cda-e9a2-426f-a617-2ac08ef16386","Type":"ContainerDied","Data":"7d209536397595a3a62721c93fdf9641064a7710faa4ab0c47cafd7144855ca9"} Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.541424 4956 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.654421 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-ovsdbserver-nb\") pod \"44991506-83b3-4752-8cc3-405ab0347a3f\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.654469 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-ovsdbserver-sb\") pod \"44991506-83b3-4752-8cc3-405ab0347a3f\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.654556 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79q5c\" (UniqueName: \"kubernetes.io/projected/44991506-83b3-4752-8cc3-405ab0347a3f-kube-api-access-79q5c\") pod \"44991506-83b3-4752-8cc3-405ab0347a3f\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.654648 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-dns-svc\") pod \"44991506-83b3-4752-8cc3-405ab0347a3f\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.654690 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-config\") pod \"44991506-83b3-4752-8cc3-405ab0347a3f\" (UID: \"44991506-83b3-4752-8cc3-405ab0347a3f\") " Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.663025 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/44991506-83b3-4752-8cc3-405ab0347a3f-kube-api-access-79q5c" (OuterVolumeSpecName: "kube-api-access-79q5c") pod "44991506-83b3-4752-8cc3-405ab0347a3f" (UID: "44991506-83b3-4752-8cc3-405ab0347a3f"). InnerVolumeSpecName "kube-api-access-79q5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.714138 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "44991506-83b3-4752-8cc3-405ab0347a3f" (UID: "44991506-83b3-4752-8cc3-405ab0347a3f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.724103 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-config" (OuterVolumeSpecName: "config") pod "44991506-83b3-4752-8cc3-405ab0347a3f" (UID: "44991506-83b3-4752-8cc3-405ab0347a3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.738835 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "44991506-83b3-4752-8cc3-405ab0347a3f" (UID: "44991506-83b3-4752-8cc3-405ab0347a3f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.747167 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "44991506-83b3-4752-8cc3-405ab0347a3f" (UID: "44991506-83b3-4752-8cc3-405ab0347a3f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.757461 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.757492 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.757501 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.757512 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44991506-83b3-4752-8cc3-405ab0347a3f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:47 crc kubenswrapper[4956]: I0930 05:44:47.757523 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79q5c\" (UniqueName: \"kubernetes.io/projected/44991506-83b3-4752-8cc3-405ab0347a3f-kube-api-access-79q5c\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.322783 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"de710cda-e9a2-426f-a617-2ac08ef16386","Type":"ContainerStarted","Data":"19c74458ef26b83bc07dab14c067ef515cc7ce8a2ed066c8d46e4a9a1b05745b"} Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.326159 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.333575 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-f688r" event={"ID":"44991506-83b3-4752-8cc3-405ab0347a3f","Type":"ContainerDied","Data":"77c906702dd6a1b497a8f0e90a9b720eeb13a7d0bde979da594b4b0d159ddd5c"} Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.333677 4956 scope.go:117] "RemoveContainer" containerID="522b4001a1ee49e91d4c452cee6d7d6d57c6eb90ee9796861627746a53a04e7d" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.368298 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-f688r"] Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.372213 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-f688r"] Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.379454 4956 scope.go:117] "RemoveContainer" containerID="6277390e1a01f01097b5d142ff84d6e16ab17d6b73b27122ac7ca1a0b0b9440b" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.691950 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.874617 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-run\") pod \"519da563-0852-472a-ba3c-ccf1e6acea16\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.874881 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-log-ovn\") pod \"519da563-0852-472a-ba3c-ccf1e6acea16\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.875059 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/519da563-0852-472a-ba3c-ccf1e6acea16-additional-scripts\") pod \"519da563-0852-472a-ba3c-ccf1e6acea16\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.874710 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-run" (OuterVolumeSpecName: "var-run") pod "519da563-0852-472a-ba3c-ccf1e6acea16" (UID: "519da563-0852-472a-ba3c-ccf1e6acea16"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.875212 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "519da563-0852-472a-ba3c-ccf1e6acea16" (UID: "519da563-0852-472a-ba3c-ccf1e6acea16"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.875361 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/519da563-0852-472a-ba3c-ccf1e6acea16-scripts\") pod \"519da563-0852-472a-ba3c-ccf1e6acea16\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.875454 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfp97\" (UniqueName: \"kubernetes.io/projected/519da563-0852-472a-ba3c-ccf1e6acea16-kube-api-access-wfp97\") pod \"519da563-0852-472a-ba3c-ccf1e6acea16\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.875561 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-run-ovn\") pod \"519da563-0852-472a-ba3c-ccf1e6acea16\" (UID: \"519da563-0852-472a-ba3c-ccf1e6acea16\") " Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.876006 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/519da563-0852-472a-ba3c-ccf1e6acea16-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "519da563-0852-472a-ba3c-ccf1e6acea16" (UID: "519da563-0852-472a-ba3c-ccf1e6acea16"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.876270 4956 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.876363 4956 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.876428 4956 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/519da563-0852-472a-ba3c-ccf1e6acea16-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.876794 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/519da563-0852-472a-ba3c-ccf1e6acea16-scripts" (OuterVolumeSpecName: "scripts") pod "519da563-0852-472a-ba3c-ccf1e6acea16" (UID: "519da563-0852-472a-ba3c-ccf1e6acea16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.876834 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "519da563-0852-472a-ba3c-ccf1e6acea16" (UID: "519da563-0852-472a-ba3c-ccf1e6acea16"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.882294 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/519da563-0852-472a-ba3c-ccf1e6acea16-kube-api-access-wfp97" (OuterVolumeSpecName: "kube-api-access-wfp97") pod "519da563-0852-472a-ba3c-ccf1e6acea16" (UID: "519da563-0852-472a-ba3c-ccf1e6acea16"). InnerVolumeSpecName "kube-api-access-wfp97". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.977647 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfp97\" (UniqueName: \"kubernetes.io/projected/519da563-0852-472a-ba3c-ccf1e6acea16-kube-api-access-wfp97\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.977677 4956 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/519da563-0852-472a-ba3c-ccf1e6acea16-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:48 crc kubenswrapper[4956]: I0930 05:44:48.977687 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/519da563-0852-472a-ba3c-ccf1e6acea16-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.339101 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r7wfs-config-2lkgj" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.339150 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r7wfs-config-2lkgj" event={"ID":"519da563-0852-472a-ba3c-ccf1e6acea16","Type":"ContainerDied","Data":"7356367315ed34ef74c5a2b492365ab6e881c0516c508d3e1879b1e126472116"} Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.339182 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7356367315ed34ef74c5a2b492365ab6e881c0516c508d3e1879b1e126472116" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.414953 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-r7wfs-config-2lkgj"] Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.421701 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-r7wfs-config-2lkgj"] Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.513822 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r7wfs-config-kg4q8"] Sep 30 05:44:49 crc kubenswrapper[4956]: E0930 05:44:49.514179 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44991506-83b3-4752-8cc3-405ab0347a3f" containerName="dnsmasq-dns" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.514198 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="44991506-83b3-4752-8cc3-405ab0347a3f" containerName="dnsmasq-dns" Sep 30 05:44:49 crc kubenswrapper[4956]: E0930 05:44:49.514230 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519da563-0852-472a-ba3c-ccf1e6acea16" containerName="ovn-config" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.514238 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="519da563-0852-472a-ba3c-ccf1e6acea16" containerName="ovn-config" Sep 30 05:44:49 crc kubenswrapper[4956]: E0930 05:44:49.514259 4956 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="44991506-83b3-4752-8cc3-405ab0347a3f" containerName="init" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.514267 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="44991506-83b3-4752-8cc3-405ab0347a3f" containerName="init" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.514418 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="44991506-83b3-4752-8cc3-405ab0347a3f" containerName="dnsmasq-dns" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.514449 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="519da563-0852-472a-ba3c-ccf1e6acea16" containerName="ovn-config" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.515024 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.518748 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.586278 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5361904b-3c30-48b4-8cbf-1838d7b8346e-additional-scripts\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.586336 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cqkm\" (UniqueName: \"kubernetes.io/projected/5361904b-3c30-48b4-8cbf-1838d7b8346e-kube-api-access-8cqkm\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.586362 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-run-ovn\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.586543 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-log-ovn\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.586588 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5361904b-3c30-48b4-8cbf-1838d7b8346e-scripts\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.586725 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-run\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.623528 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r7wfs-config-kg4q8"] Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.687364 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-log-ovn\") pod \"ovn-controller-r7wfs-config-kg4q8\" 
(UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.687416 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5361904b-3c30-48b4-8cbf-1838d7b8346e-scripts\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.687477 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-run\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.687580 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5361904b-3c30-48b4-8cbf-1838d7b8346e-additional-scripts\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.687608 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-log-ovn\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.687609 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cqkm\" (UniqueName: \"kubernetes.io/projected/5361904b-3c30-48b4-8cbf-1838d7b8346e-kube-api-access-8cqkm\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: 
\"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.687653 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-run-ovn\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.687727 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-run-ovn\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.687778 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-run\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.688388 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5361904b-3c30-48b4-8cbf-1838d7b8346e-additional-scripts\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.689159 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5361904b-3c30-48b4-8cbf-1838d7b8346e-scripts\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " 
pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.771308 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cqkm\" (UniqueName: \"kubernetes.io/projected/5361904b-3c30-48b4-8cbf-1838d7b8346e-kube-api-access-8cqkm\") pod \"ovn-controller-r7wfs-config-kg4q8\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.856956 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-r7wfs" Sep 30 05:44:49 crc kubenswrapper[4956]: I0930 05:44:49.859911 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:50 crc kubenswrapper[4956]: I0930 05:44:50.357901 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44991506-83b3-4752-8cc3-405ab0347a3f" path="/var/lib/kubelet/pods/44991506-83b3-4752-8cc3-405ab0347a3f/volumes" Sep 30 05:44:50 crc kubenswrapper[4956]: I0930 05:44:50.359569 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="519da563-0852-472a-ba3c-ccf1e6acea16" path="/var/lib/kubelet/pods/519da563-0852-472a-ba3c-ccf1e6acea16/volumes" Sep 30 05:44:50 crc kubenswrapper[4956]: I0930 05:44:50.360374 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r7wfs-config-kg4q8"] Sep 30 05:44:50 crc kubenswrapper[4956]: I0930 05:44:50.360423 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"de710cda-e9a2-426f-a617-2ac08ef16386","Type":"ContainerStarted","Data":"97f6c0624224989180f4175a911599b14468c1c98c87cd5acfbfb0bfaeb53233"} Sep 30 05:44:50 crc kubenswrapper[4956]: I0930 05:44:50.360445 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"de710cda-e9a2-426f-a617-2ac08ef16386","Type":"ContainerStarted","Data":"83eb93e1d04ae7cee3cd63476ad6e49e021a57bbe3ff75ebba7a0fddd5c8e5ea"} Sep 30 05:44:50 crc kubenswrapper[4956]: I0930 05:44:50.412602 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.412586564 podStartE2EDuration="14.412586564s" podCreationTimestamp="2025-09-30 05:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:44:50.409793025 +0000 UTC m=+960.736913570" watchObservedRunningTime="2025-09-30 05:44:50.412586564 +0000 UTC m=+960.739707089" Sep 30 05:44:50 crc kubenswrapper[4956]: I0930 05:44:50.598290 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="0ae54b47-b5ac-43a0-9752-797d2f81ff29" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Sep 30 05:44:50 crc kubenswrapper[4956]: I0930 05:44:50.658396 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Sep 30 05:44:50 crc kubenswrapper[4956]: I0930 05:44:50.863554 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="de3a8c94-71b5-4948-9079-cc7009b9a8ea" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Sep 30 05:44:51 crc kubenswrapper[4956]: I0930 05:44:51.368510 4956 generic.go:334] "Generic (PLEG): container finished" podID="5361904b-3c30-48b4-8cbf-1838d7b8346e" containerID="8ee40a03262790de3fbb7f3871a285f616d4b453cac08f4902675e2c34bc2937" exitCode=0 Sep 30 05:44:51 crc kubenswrapper[4956]: I0930 05:44:51.368623 
4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r7wfs-config-kg4q8" event={"ID":"5361904b-3c30-48b4-8cbf-1838d7b8346e","Type":"ContainerDied","Data":"8ee40a03262790de3fbb7f3871a285f616d4b453cac08f4902675e2c34bc2937"} Sep 30 05:44:51 crc kubenswrapper[4956]: I0930 05:44:51.368717 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r7wfs-config-kg4q8" event={"ID":"5361904b-3c30-48b4-8cbf-1838d7b8346e","Type":"ContainerStarted","Data":"08f2094b896a2b62d46333cde831e40e93f8a5f78b8193be21edad0932f08b45"} Sep 30 05:44:51 crc kubenswrapper[4956]: I0930 05:44:51.832977 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:51 crc kubenswrapper[4956]: I0930 05:44:51.833242 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:51 crc kubenswrapper[4956]: I0930 05:44:51.840937 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:52 crc kubenswrapper[4956]: I0930 05:44:52.378972 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.661476 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.764129 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-log-ovn\") pod \"5361904b-3c30-48b4-8cbf-1838d7b8346e\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.764171 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-run\") pod \"5361904b-3c30-48b4-8cbf-1838d7b8346e\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.764198 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5361904b-3c30-48b4-8cbf-1838d7b8346e-scripts\") pod \"5361904b-3c30-48b4-8cbf-1838d7b8346e\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.764215 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-run-ovn\") pod \"5361904b-3c30-48b4-8cbf-1838d7b8346e\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.764318 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5361904b-3c30-48b4-8cbf-1838d7b8346e-additional-scripts\") pod \"5361904b-3c30-48b4-8cbf-1838d7b8346e\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.764319 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-run" (OuterVolumeSpecName: "var-run") pod "5361904b-3c30-48b4-8cbf-1838d7b8346e" (UID: "5361904b-3c30-48b4-8cbf-1838d7b8346e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.764345 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cqkm\" (UniqueName: \"kubernetes.io/projected/5361904b-3c30-48b4-8cbf-1838d7b8346e-kube-api-access-8cqkm\") pod \"5361904b-3c30-48b4-8cbf-1838d7b8346e\" (UID: \"5361904b-3c30-48b4-8cbf-1838d7b8346e\") " Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.764376 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5361904b-3c30-48b4-8cbf-1838d7b8346e" (UID: "5361904b-3c30-48b4-8cbf-1838d7b8346e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.764608 4956 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.764624 4956 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.764606 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5361904b-3c30-48b4-8cbf-1838d7b8346e" (UID: "5361904b-3c30-48b4-8cbf-1838d7b8346e"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.765091 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5361904b-3c30-48b4-8cbf-1838d7b8346e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5361904b-3c30-48b4-8cbf-1838d7b8346e" (UID: "5361904b-3c30-48b4-8cbf-1838d7b8346e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.765530 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5361904b-3c30-48b4-8cbf-1838d7b8346e-scripts" (OuterVolumeSpecName: "scripts") pod "5361904b-3c30-48b4-8cbf-1838d7b8346e" (UID: "5361904b-3c30-48b4-8cbf-1838d7b8346e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.767893 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5361904b-3c30-48b4-8cbf-1838d7b8346e-kube-api-access-8cqkm" (OuterVolumeSpecName: "kube-api-access-8cqkm") pod "5361904b-3c30-48b4-8cbf-1838d7b8346e" (UID: "5361904b-3c30-48b4-8cbf-1838d7b8346e"). InnerVolumeSpecName "kube-api-access-8cqkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.866497 4956 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5361904b-3c30-48b4-8cbf-1838d7b8346e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.866541 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5361904b-3c30-48b4-8cbf-1838d7b8346e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.866559 4956 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5361904b-3c30-48b4-8cbf-1838d7b8346e-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:57 crc kubenswrapper[4956]: I0930 05:44:57.866576 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cqkm\" (UniqueName: \"kubernetes.io/projected/5361904b-3c30-48b4-8cbf-1838d7b8346e-kube-api-access-8cqkm\") on node \"crc\" DevicePath \"\"" Sep 30 05:44:58 crc kubenswrapper[4956]: I0930 05:44:58.425829 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r7wfs-config-kg4q8" Sep 30 05:44:58 crc kubenswrapper[4956]: I0930 05:44:58.425866 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r7wfs-config-kg4q8" event={"ID":"5361904b-3c30-48b4-8cbf-1838d7b8346e","Type":"ContainerDied","Data":"08f2094b896a2b62d46333cde831e40e93f8a5f78b8193be21edad0932f08b45"} Sep 30 05:44:58 crc kubenswrapper[4956]: I0930 05:44:58.426251 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08f2094b896a2b62d46333cde831e40e93f8a5f78b8193be21edad0932f08b45" Sep 30 05:44:58 crc kubenswrapper[4956]: I0930 05:44:58.428455 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6sh8d" event={"ID":"0ff0cad1-ad65-4838-b4d7-b43974fbe477","Type":"ContainerStarted","Data":"28fc84e7e1f8da81e6de8ac31305b2590c17a3be728b641677bdbea2830e17fd"} Sep 30 05:44:58 crc kubenswrapper[4956]: I0930 05:44:58.447718 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-6sh8d" podStartSLOduration=2.574016514 podStartE2EDuration="19.447671583s" podCreationTimestamp="2025-09-30 05:44:39 +0000 UTC" firstStartedPulling="2025-09-30 05:44:40.815946234 +0000 UTC m=+951.143066759" lastFinishedPulling="2025-09-30 05:44:57.689601303 +0000 UTC m=+968.016721828" observedRunningTime="2025-09-30 05:44:58.443678127 +0000 UTC m=+968.770798662" watchObservedRunningTime="2025-09-30 05:44:58.447671583 +0000 UTC m=+968.774792108" Sep 30 05:44:58 crc kubenswrapper[4956]: I0930 05:44:58.750437 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-r7wfs-config-kg4q8"] Sep 30 05:44:58 crc kubenswrapper[4956]: I0930 05:44:58.758458 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-r7wfs-config-kg4q8"] Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.136238 4956 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk"] Sep 30 05:45:00 crc kubenswrapper[4956]: E0930 05:45:00.136888 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5361904b-3c30-48b4-8cbf-1838d7b8346e" containerName="ovn-config" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.136904 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5361904b-3c30-48b4-8cbf-1838d7b8346e" containerName="ovn-config" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.137130 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5361904b-3c30-48b4-8cbf-1838d7b8346e" containerName="ovn-config" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.137809 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.141046 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.141925 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.157555 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk"] Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.205996 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnt4f\" (UniqueName: \"kubernetes.io/projected/017b2353-9d72-4844-a21b-9dce833ae063-kube-api-access-mnt4f\") pod \"collect-profiles-29320185-xmjtk\" (UID: \"017b2353-9d72-4844-a21b-9dce833ae063\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.206088 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/017b2353-9d72-4844-a21b-9dce833ae063-secret-volume\") pod \"collect-profiles-29320185-xmjtk\" (UID: \"017b2353-9d72-4844-a21b-9dce833ae063\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.206240 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/017b2353-9d72-4844-a21b-9dce833ae063-config-volume\") pod \"collect-profiles-29320185-xmjtk\" (UID: \"017b2353-9d72-4844-a21b-9dce833ae063\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.307798 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/017b2353-9d72-4844-a21b-9dce833ae063-config-volume\") pod \"collect-profiles-29320185-xmjtk\" (UID: \"017b2353-9d72-4844-a21b-9dce833ae063\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.307900 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnt4f\" (UniqueName: \"kubernetes.io/projected/017b2353-9d72-4844-a21b-9dce833ae063-kube-api-access-mnt4f\") pod \"collect-profiles-29320185-xmjtk\" (UID: \"017b2353-9d72-4844-a21b-9dce833ae063\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.307943 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/017b2353-9d72-4844-a21b-9dce833ae063-secret-volume\") pod \"collect-profiles-29320185-xmjtk\" (UID: 
\"017b2353-9d72-4844-a21b-9dce833ae063\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.308719 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/017b2353-9d72-4844-a21b-9dce833ae063-config-volume\") pod \"collect-profiles-29320185-xmjtk\" (UID: \"017b2353-9d72-4844-a21b-9dce833ae063\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.318099 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/017b2353-9d72-4844-a21b-9dce833ae063-secret-volume\") pod \"collect-profiles-29320185-xmjtk\" (UID: \"017b2353-9d72-4844-a21b-9dce833ae063\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.324219 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnt4f\" (UniqueName: \"kubernetes.io/projected/017b2353-9d72-4844-a21b-9dce833ae063-kube-api-access-mnt4f\") pod \"collect-profiles-29320185-xmjtk\" (UID: \"017b2353-9d72-4844-a21b-9dce833ae063\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.351320 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5361904b-3c30-48b4-8cbf-1838d7b8346e" path="/var/lib/kubelet/pods/5361904b-3c30-48b4-8cbf-1838d7b8346e/volumes" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.469277 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.599198 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.660738 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.879438 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bk28v"] Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.880617 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.880967 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bk28v" Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.909767 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bk28v"] Sep 30 05:45:00 crc kubenswrapper[4956]: I0930 05:45:00.987877 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk"] Sep 30 05:45:01 crc kubenswrapper[4956]: W0930 05:45:01.015183 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod017b2353_9d72_4844_a21b_9dce833ae063.slice/crio-b3dad676478c53299dc68ae1cda11dc41f317fed278d65e59c8a6c7a4089265a WatchSource:0}: Error finding container b3dad676478c53299dc68ae1cda11dc41f317fed278d65e59c8a6c7a4089265a: Status 404 returned error can't find the container with id b3dad676478c53299dc68ae1cda11dc41f317fed278d65e59c8a6c7a4089265a Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.021977 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndxsh\" (UniqueName: \"kubernetes.io/projected/b14fc082-107c-44e4-b2c2-314d9b8010c7-kube-api-access-ndxsh\") pod \"barbican-db-create-bk28v\" (UID: \"b14fc082-107c-44e4-b2c2-314d9b8010c7\") " pod="openstack/barbican-db-create-bk28v" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.080181 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-ks9m6"] Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.081610 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ks9m6" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.099871 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ks9m6"] Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.124327 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndxsh\" (UniqueName: \"kubernetes.io/projected/b14fc082-107c-44e4-b2c2-314d9b8010c7-kube-api-access-ndxsh\") pod \"barbican-db-create-bk28v\" (UID: \"b14fc082-107c-44e4-b2c2-314d9b8010c7\") " pod="openstack/barbican-db-create-bk28v" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.148842 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndxsh\" (UniqueName: \"kubernetes.io/projected/b14fc082-107c-44e4-b2c2-314d9b8010c7-kube-api-access-ndxsh\") pod \"barbican-db-create-bk28v\" (UID: \"b14fc082-107c-44e4-b2c2-314d9b8010c7\") " pod="openstack/barbican-db-create-bk28v" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.214573 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bk28v" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.215844 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-tkh7q"] Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.217173 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.219558 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.222553 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.223108 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n4lp7" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.223235 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.229224 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tgbq\" (UniqueName: \"kubernetes.io/projected/dc066a4f-bea1-47c4-96c4-f3dcda1a930d-kube-api-access-4tgbq\") pod \"cinder-db-create-ks9m6\" (UID: \"dc066a4f-bea1-47c4-96c4-f3dcda1a930d\") " pod="openstack/cinder-db-create-ks9m6" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.229592 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-tkh7q"] Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.258107 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-c75th"] Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.259090 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-c75th" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.304279 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-c75th"] Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.340822 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6rc2\" (UniqueName: \"kubernetes.io/projected/f51c959c-848b-46f3-a128-1604dd5fc435-kube-api-access-x6rc2\") pod \"neutron-db-create-c75th\" (UID: \"f51c959c-848b-46f3-a128-1604dd5fc435\") " pod="openstack/neutron-db-create-c75th" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.341079 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826ffd10-134e-452b-93de-3a9bc469d44d-combined-ca-bundle\") pod \"keystone-db-sync-tkh7q\" (UID: \"826ffd10-134e-452b-93de-3a9bc469d44d\") " pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.341144 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqsz8\" (UniqueName: \"kubernetes.io/projected/826ffd10-134e-452b-93de-3a9bc469d44d-kube-api-access-qqsz8\") pod \"keystone-db-sync-tkh7q\" (UID: \"826ffd10-134e-452b-93de-3a9bc469d44d\") " pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.341191 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tgbq\" (UniqueName: \"kubernetes.io/projected/dc066a4f-bea1-47c4-96c4-f3dcda1a930d-kube-api-access-4tgbq\") pod \"cinder-db-create-ks9m6\" (UID: \"dc066a4f-bea1-47c4-96c4-f3dcda1a930d\") " pod="openstack/cinder-db-create-ks9m6" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.341228 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826ffd10-134e-452b-93de-3a9bc469d44d-config-data\") pod \"keystone-db-sync-tkh7q\" (UID: \"826ffd10-134e-452b-93de-3a9bc469d44d\") " pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.390214 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tgbq\" (UniqueName: \"kubernetes.io/projected/dc066a4f-bea1-47c4-96c4-f3dcda1a930d-kube-api-access-4tgbq\") pod \"cinder-db-create-ks9m6\" (UID: \"dc066a4f-bea1-47c4-96c4-f3dcda1a930d\") " pod="openstack/cinder-db-create-ks9m6" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.416837 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ks9m6" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.443240 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6rc2\" (UniqueName: \"kubernetes.io/projected/f51c959c-848b-46f3-a128-1604dd5fc435-kube-api-access-x6rc2\") pod \"neutron-db-create-c75th\" (UID: \"f51c959c-848b-46f3-a128-1604dd5fc435\") " pod="openstack/neutron-db-create-c75th" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.443375 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826ffd10-134e-452b-93de-3a9bc469d44d-combined-ca-bundle\") pod \"keystone-db-sync-tkh7q\" (UID: \"826ffd10-134e-452b-93de-3a9bc469d44d\") " pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.443401 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqsz8\" (UniqueName: \"kubernetes.io/projected/826ffd10-134e-452b-93de-3a9bc469d44d-kube-api-access-qqsz8\") pod \"keystone-db-sync-tkh7q\" (UID: \"826ffd10-134e-452b-93de-3a9bc469d44d\") " pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:01 crc kubenswrapper[4956]: 
I0930 05:45:01.443460 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826ffd10-134e-452b-93de-3a9bc469d44d-config-data\") pod \"keystone-db-sync-tkh7q\" (UID: \"826ffd10-134e-452b-93de-3a9bc469d44d\") " pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.449762 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826ffd10-134e-452b-93de-3a9bc469d44d-config-data\") pod \"keystone-db-sync-tkh7q\" (UID: \"826ffd10-134e-452b-93de-3a9bc469d44d\") " pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.450813 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826ffd10-134e-452b-93de-3a9bc469d44d-combined-ca-bundle\") pod \"keystone-db-sync-tkh7q\" (UID: \"826ffd10-134e-452b-93de-3a9bc469d44d\") " pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.474074 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6rc2\" (UniqueName: \"kubernetes.io/projected/f51c959c-848b-46f3-a128-1604dd5fc435-kube-api-access-x6rc2\") pod \"neutron-db-create-c75th\" (UID: \"f51c959c-848b-46f3-a128-1604dd5fc435\") " pod="openstack/neutron-db-create-c75th" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.478014 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqsz8\" (UniqueName: \"kubernetes.io/projected/826ffd10-134e-452b-93de-3a9bc469d44d-kube-api-access-qqsz8\") pod \"keystone-db-sync-tkh7q\" (UID: \"826ffd10-134e-452b-93de-3a9bc469d44d\") " pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.479994 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" event={"ID":"017b2353-9d72-4844-a21b-9dce833ae063","Type":"ContainerStarted","Data":"1f5a2fd296b771e30997d3629edd94c876285fe1b5e3c9d763d3b93267c3b487"} Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.480037 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" event={"ID":"017b2353-9d72-4844-a21b-9dce833ae063","Type":"ContainerStarted","Data":"b3dad676478c53299dc68ae1cda11dc41f317fed278d65e59c8a6c7a4089265a"} Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.579896 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.593060 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-c75th" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.790607 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" podStartSLOduration=1.790550997 podStartE2EDuration="1.790550997s" podCreationTimestamp="2025-09-30 05:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:01.501599512 +0000 UTC m=+971.828720037" watchObservedRunningTime="2025-09-30 05:45:01.790550997 +0000 UTC m=+972.117671522" Sep 30 05:45:01 crc kubenswrapper[4956]: I0930 05:45:01.794239 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bk28v"] Sep 30 05:45:02 crc kubenswrapper[4956]: I0930 05:45:02.017351 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ks9m6"] Sep 30 05:45:02 crc kubenswrapper[4956]: I0930 05:45:02.199975 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-db-sync-tkh7q"] Sep 30 05:45:02 crc kubenswrapper[4956]: I0930 05:45:02.210502 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-c75th"] Sep 30 05:45:02 crc kubenswrapper[4956]: I0930 05:45:02.491948 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tkh7q" event={"ID":"826ffd10-134e-452b-93de-3a9bc469d44d","Type":"ContainerStarted","Data":"d9d100843c02f8f5cb33980511442549c074d44b242bbd8619bbe7e6e82424a4"} Sep 30 05:45:02 crc kubenswrapper[4956]: I0930 05:45:02.493349 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ks9m6" event={"ID":"dc066a4f-bea1-47c4-96c4-f3dcda1a930d","Type":"ContainerStarted","Data":"6b1ce41b85219be5ebfdd860a50ec71709c6b70955b9125cbbd0a0015f100a6e"} Sep 30 05:45:02 crc kubenswrapper[4956]: I0930 05:45:02.494967 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-c75th" event={"ID":"f51c959c-848b-46f3-a128-1604dd5fc435","Type":"ContainerStarted","Data":"3d88330b8cd482a692500feab85b22883cdfcacca870db3db9b71b65f3365d91"} Sep 30 05:45:02 crc kubenswrapper[4956]: I0930 05:45:02.496860 4956 generic.go:334] "Generic (PLEG): container finished" podID="017b2353-9d72-4844-a21b-9dce833ae063" containerID="1f5a2fd296b771e30997d3629edd94c876285fe1b5e3c9d763d3b93267c3b487" exitCode=0 Sep 30 05:45:02 crc kubenswrapper[4956]: I0930 05:45:02.496898 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" event={"ID":"017b2353-9d72-4844-a21b-9dce833ae063","Type":"ContainerDied","Data":"1f5a2fd296b771e30997d3629edd94c876285fe1b5e3c9d763d3b93267c3b487"} Sep 30 05:45:02 crc kubenswrapper[4956]: I0930 05:45:02.498888 4956 generic.go:334] "Generic (PLEG): container finished" podID="b14fc082-107c-44e4-b2c2-314d9b8010c7" containerID="6db3d0166ba7a8f7de8a258c636eecb483ebfa0b0bdfd0727ca2b4772507c5c2" exitCode=0 Sep 30 05:45:02 
crc kubenswrapper[4956]: I0930 05:45:02.498926 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bk28v" event={"ID":"b14fc082-107c-44e4-b2c2-314d9b8010c7","Type":"ContainerDied","Data":"6db3d0166ba7a8f7de8a258c636eecb483ebfa0b0bdfd0727ca2b4772507c5c2"} Sep 30 05:45:02 crc kubenswrapper[4956]: I0930 05:45:02.498942 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bk28v" event={"ID":"b14fc082-107c-44e4-b2c2-314d9b8010c7","Type":"ContainerStarted","Data":"fd987777bd91f9fdd3aaacce271f3705baa4ae0a628188c5023ef85645353dbd"} Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.708273 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-24d8t"] Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.711088 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.713184 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.713570 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-4h2b6" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.717349 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-24d8t"] Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.785795 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-combined-ca-bundle\") pod \"watcher-db-sync-24d8t\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.785850 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jg9w7\" (UniqueName: \"kubernetes.io/projected/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-kube-api-access-jg9w7\") pod \"watcher-db-sync-24d8t\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.785889 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-db-sync-config-data\") pod \"watcher-db-sync-24d8t\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.785973 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-config-data\") pod \"watcher-db-sync-24d8t\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.889220 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-config-data\") pod \"watcher-db-sync-24d8t\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.889330 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-combined-ca-bundle\") pod \"watcher-db-sync-24d8t\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.889373 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9w7\" (UniqueName: 
\"kubernetes.io/projected/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-kube-api-access-jg9w7\") pod \"watcher-db-sync-24d8t\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.889424 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-db-sync-config-data\") pod \"watcher-db-sync-24d8t\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.895550 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-config-data\") pod \"watcher-db-sync-24d8t\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.895979 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-combined-ca-bundle\") pod \"watcher-db-sync-24d8t\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.897613 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-db-sync-config-data\") pod \"watcher-db-sync-24d8t\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:03 crc kubenswrapper[4956]: I0930 05:45:03.926719 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg9w7\" (UniqueName: \"kubernetes.io/projected/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-kube-api-access-jg9w7\") pod \"watcher-db-sync-24d8t\" (UID: 
\"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.036126 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.183401 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.238127 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bk28v" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.294606 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnt4f\" (UniqueName: \"kubernetes.io/projected/017b2353-9d72-4844-a21b-9dce833ae063-kube-api-access-mnt4f\") pod \"017b2353-9d72-4844-a21b-9dce833ae063\" (UID: \"017b2353-9d72-4844-a21b-9dce833ae063\") " Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.294733 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/017b2353-9d72-4844-a21b-9dce833ae063-config-volume\") pod \"017b2353-9d72-4844-a21b-9dce833ae063\" (UID: \"017b2353-9d72-4844-a21b-9dce833ae063\") " Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.294783 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndxsh\" (UniqueName: \"kubernetes.io/projected/b14fc082-107c-44e4-b2c2-314d9b8010c7-kube-api-access-ndxsh\") pod \"b14fc082-107c-44e4-b2c2-314d9b8010c7\" (UID: \"b14fc082-107c-44e4-b2c2-314d9b8010c7\") " Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.294817 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/017b2353-9d72-4844-a21b-9dce833ae063-secret-volume\") pod \"017b2353-9d72-4844-a21b-9dce833ae063\" (UID: \"017b2353-9d72-4844-a21b-9dce833ae063\") " Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.296310 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017b2353-9d72-4844-a21b-9dce833ae063-config-volume" (OuterVolumeSpecName: "config-volume") pod "017b2353-9d72-4844-a21b-9dce833ae063" (UID: "017b2353-9d72-4844-a21b-9dce833ae063"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.306323 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/017b2353-9d72-4844-a21b-9dce833ae063-kube-api-access-mnt4f" (OuterVolumeSpecName: "kube-api-access-mnt4f") pod "017b2353-9d72-4844-a21b-9dce833ae063" (UID: "017b2353-9d72-4844-a21b-9dce833ae063"). InnerVolumeSpecName "kube-api-access-mnt4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.315849 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b14fc082-107c-44e4-b2c2-314d9b8010c7-kube-api-access-ndxsh" (OuterVolumeSpecName: "kube-api-access-ndxsh") pod "b14fc082-107c-44e4-b2c2-314d9b8010c7" (UID: "b14fc082-107c-44e4-b2c2-314d9b8010c7"). InnerVolumeSpecName "kube-api-access-ndxsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.321842 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017b2353-9d72-4844-a21b-9dce833ae063-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "017b2353-9d72-4844-a21b-9dce833ae063" (UID: "017b2353-9d72-4844-a21b-9dce833ae063"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.397020 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/017b2353-9d72-4844-a21b-9dce833ae063-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.397304 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndxsh\" (UniqueName: \"kubernetes.io/projected/b14fc082-107c-44e4-b2c2-314d9b8010c7-kube-api-access-ndxsh\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.397352 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/017b2353-9d72-4844-a21b-9dce833ae063-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.397377 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnt4f\" (UniqueName: \"kubernetes.io/projected/017b2353-9d72-4844-a21b-9dce833ae063-kube-api-access-mnt4f\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.517401 4956 generic.go:334] "Generic (PLEG): container finished" podID="f51c959c-848b-46f3-a128-1604dd5fc435" containerID="0caea1a43ed4362bbcff0cb4946f5fe971ff1378fb1c48fd54c8129c5feabb23" exitCode=0 Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.517478 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-c75th" event={"ID":"f51c959c-848b-46f3-a128-1604dd5fc435","Type":"ContainerDied","Data":"0caea1a43ed4362bbcff0cb4946f5fe971ff1378fb1c48fd54c8129c5feabb23"} Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.520607 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.521194 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk" event={"ID":"017b2353-9d72-4844-a21b-9dce833ae063","Type":"ContainerDied","Data":"b3dad676478c53299dc68ae1cda11dc41f317fed278d65e59c8a6c7a4089265a"} Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.521213 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3dad676478c53299dc68ae1cda11dc41f317fed278d65e59c8a6c7a4089265a" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.525020 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bk28v" event={"ID":"b14fc082-107c-44e4-b2c2-314d9b8010c7","Type":"ContainerDied","Data":"fd987777bd91f9fdd3aaacce271f3705baa4ae0a628188c5023ef85645353dbd"} Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.525066 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd987777bd91f9fdd3aaacce271f3705baa4ae0a628188c5023ef85645353dbd" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.525138 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bk28v" Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.531067 4956 generic.go:334] "Generic (PLEG): container finished" podID="dc066a4f-bea1-47c4-96c4-f3dcda1a930d" containerID="5f6f01c705c6103866b94d4b909e757392136708c6914d11166f55f72f036861" exitCode=0 Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.531107 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ks9m6" event={"ID":"dc066a4f-bea1-47c4-96c4-f3dcda1a930d","Type":"ContainerDied","Data":"5f6f01c705c6103866b94d4b909e757392136708c6914d11166f55f72f036861"} Sep 30 05:45:04 crc kubenswrapper[4956]: I0930 05:45:04.583473 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-24d8t"] Sep 30 05:45:04 crc kubenswrapper[4956]: W0930 05:45:04.584354 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde9dd57e_7978_4d30_b9ec_a3894e9d67e9.slice/crio-0c39f4268725b531d4869c1a7d517bb6a2ab8d70f4285234770820df6e66e349 WatchSource:0}: Error finding container 0c39f4268725b531d4869c1a7d517bb6a2ab8d70f4285234770820df6e66e349: Status 404 returned error can't find the container with id 0c39f4268725b531d4869c1a7d517bb6a2ab8d70f4285234770820df6e66e349 Sep 30 05:45:05 crc kubenswrapper[4956]: I0930 05:45:05.557932 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-24d8t" event={"ID":"de9dd57e-7978-4d30-b9ec-a3894e9d67e9","Type":"ContainerStarted","Data":"0c39f4268725b531d4869c1a7d517bb6a2ab8d70f4285234770820df6e66e349"} Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.070951 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ks9m6" Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.082642 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-c75th" Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.130811 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tgbq\" (UniqueName: \"kubernetes.io/projected/dc066a4f-bea1-47c4-96c4-f3dcda1a930d-kube-api-access-4tgbq\") pod \"dc066a4f-bea1-47c4-96c4-f3dcda1a930d\" (UID: \"dc066a4f-bea1-47c4-96c4-f3dcda1a930d\") " Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.130933 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6rc2\" (UniqueName: \"kubernetes.io/projected/f51c959c-848b-46f3-a128-1604dd5fc435-kube-api-access-x6rc2\") pod \"f51c959c-848b-46f3-a128-1604dd5fc435\" (UID: \"f51c959c-848b-46f3-a128-1604dd5fc435\") " Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.135882 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc066a4f-bea1-47c4-96c4-f3dcda1a930d-kube-api-access-4tgbq" (OuterVolumeSpecName: "kube-api-access-4tgbq") pod "dc066a4f-bea1-47c4-96c4-f3dcda1a930d" (UID: "dc066a4f-bea1-47c4-96c4-f3dcda1a930d"). InnerVolumeSpecName "kube-api-access-4tgbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.136865 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f51c959c-848b-46f3-a128-1604dd5fc435-kube-api-access-x6rc2" (OuterVolumeSpecName: "kube-api-access-x6rc2") pod "f51c959c-848b-46f3-a128-1604dd5fc435" (UID: "f51c959c-848b-46f3-a128-1604dd5fc435"). InnerVolumeSpecName "kube-api-access-x6rc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.233217 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tgbq\" (UniqueName: \"kubernetes.io/projected/dc066a4f-bea1-47c4-96c4-f3dcda1a930d-kube-api-access-4tgbq\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.233251 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6rc2\" (UniqueName: \"kubernetes.io/projected/f51c959c-848b-46f3-a128-1604dd5fc435-kube-api-access-x6rc2\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.567018 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ks9m6" Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.567025 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ks9m6" event={"ID":"dc066a4f-bea1-47c4-96c4-f3dcda1a930d","Type":"ContainerDied","Data":"6b1ce41b85219be5ebfdd860a50ec71709c6b70955b9125cbbd0a0015f100a6e"} Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.567081 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b1ce41b85219be5ebfdd860a50ec71709c6b70955b9125cbbd0a0015f100a6e" Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.568570 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-c75th" Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.568559 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-c75th" event={"ID":"f51c959c-848b-46f3-a128-1604dd5fc435","Type":"ContainerDied","Data":"3d88330b8cd482a692500feab85b22883cdfcacca870db3db9b71b65f3365d91"} Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.568716 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d88330b8cd482a692500feab85b22883cdfcacca870db3db9b71b65f3365d91" Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.570059 4956 generic.go:334] "Generic (PLEG): container finished" podID="0ff0cad1-ad65-4838-b4d7-b43974fbe477" containerID="28fc84e7e1f8da81e6de8ac31305b2590c17a3be728b641677bdbea2830e17fd" exitCode=0 Sep 30 05:45:06 crc kubenswrapper[4956]: I0930 05:45:06.570092 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6sh8d" event={"ID":"0ff0cad1-ad65-4838-b4d7-b43974fbe477","Type":"ContainerDied","Data":"28fc84e7e1f8da81e6de8ac31305b2590c17a3be728b641677bdbea2830e17fd"} Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.050224 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6sh8d" Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.080998 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8drsm\" (UniqueName: \"kubernetes.io/projected/0ff0cad1-ad65-4838-b4d7-b43974fbe477-kube-api-access-8drsm\") pod \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.081098 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-config-data\") pod \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.081303 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-db-sync-config-data\") pod \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.081341 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-combined-ca-bundle\") pod \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\" (UID: \"0ff0cad1-ad65-4838-b4d7-b43974fbe477\") " Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.088638 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ff0cad1-ad65-4838-b4d7-b43974fbe477-kube-api-access-8drsm" (OuterVolumeSpecName: "kube-api-access-8drsm") pod "0ff0cad1-ad65-4838-b4d7-b43974fbe477" (UID: "0ff0cad1-ad65-4838-b4d7-b43974fbe477"). InnerVolumeSpecName "kube-api-access-8drsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.096538 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0ff0cad1-ad65-4838-b4d7-b43974fbe477" (UID: "0ff0cad1-ad65-4838-b4d7-b43974fbe477"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.110912 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ff0cad1-ad65-4838-b4d7-b43974fbe477" (UID: "0ff0cad1-ad65-4838-b4d7-b43974fbe477"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.141544 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-config-data" (OuterVolumeSpecName: "config-data") pod "0ff0cad1-ad65-4838-b4d7-b43974fbe477" (UID: "0ff0cad1-ad65-4838-b4d7-b43974fbe477"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.183085 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8drsm\" (UniqueName: \"kubernetes.io/projected/0ff0cad1-ad65-4838-b4d7-b43974fbe477-kube-api-access-8drsm\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.183134 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.183143 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.183152 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff0cad1-ad65-4838-b4d7-b43974fbe477-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.597271 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tkh7q" event={"ID":"826ffd10-134e-452b-93de-3a9bc469d44d","Type":"ContainerStarted","Data":"18af1d264db565f9694f2705f2a0d5db22f030f2f86aff76c63caaa9f8edecde"} Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.600333 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6sh8d" event={"ID":"0ff0cad1-ad65-4838-b4d7-b43974fbe477","Type":"ContainerDied","Data":"9f7199404b01e95d8e91a7f6c1b560d49a7d63dd60580fe78a38f811dbe8401f"} Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.600364 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f7199404b01e95d8e91a7f6c1b560d49a7d63dd60580fe78a38f811dbe8401f" Sep 30 05:45:09 crc 
kubenswrapper[4956]: I0930 05:45:09.600490 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6sh8d" Sep 30 05:45:09 crc kubenswrapper[4956]: I0930 05:45:09.620603 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-tkh7q" podStartSLOduration=1.774306003 podStartE2EDuration="8.620585403s" podCreationTimestamp="2025-09-30 05:45:01 +0000 UTC" firstStartedPulling="2025-09-30 05:45:02.207513253 +0000 UTC m=+972.534633778" lastFinishedPulling="2025-09-30 05:45:09.053792653 +0000 UTC m=+979.380913178" observedRunningTime="2025-09-30 05:45:09.613140039 +0000 UTC m=+979.940260574" watchObservedRunningTime="2025-09-30 05:45:09.620585403 +0000 UTC m=+979.947705918" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.471546 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dd98f6d57-fztws"] Sep 30 05:45:10 crc kubenswrapper[4956]: E0930 05:45:10.472151 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017b2353-9d72-4844-a21b-9dce833ae063" containerName="collect-profiles" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.472164 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="017b2353-9d72-4844-a21b-9dce833ae063" containerName="collect-profiles" Sep 30 05:45:10 crc kubenswrapper[4956]: E0930 05:45:10.472176 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff0cad1-ad65-4838-b4d7-b43974fbe477" containerName="glance-db-sync" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.472182 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff0cad1-ad65-4838-b4d7-b43974fbe477" containerName="glance-db-sync" Sep 30 05:45:10 crc kubenswrapper[4956]: E0930 05:45:10.472193 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc066a4f-bea1-47c4-96c4-f3dcda1a930d" containerName="mariadb-database-create" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 
05:45:10.472199 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc066a4f-bea1-47c4-96c4-f3dcda1a930d" containerName="mariadb-database-create" Sep 30 05:45:10 crc kubenswrapper[4956]: E0930 05:45:10.472210 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b14fc082-107c-44e4-b2c2-314d9b8010c7" containerName="mariadb-database-create" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.472216 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14fc082-107c-44e4-b2c2-314d9b8010c7" containerName="mariadb-database-create" Sep 30 05:45:10 crc kubenswrapper[4956]: E0930 05:45:10.472229 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f51c959c-848b-46f3-a128-1604dd5fc435" containerName="mariadb-database-create" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.472236 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f51c959c-848b-46f3-a128-1604dd5fc435" containerName="mariadb-database-create" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.472408 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff0cad1-ad65-4838-b4d7-b43974fbe477" containerName="glance-db-sync" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.472423 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc066a4f-bea1-47c4-96c4-f3dcda1a930d" containerName="mariadb-database-create" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.472433 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b14fc082-107c-44e4-b2c2-314d9b8010c7" containerName="mariadb-database-create" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.472445 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f51c959c-848b-46f3-a128-1604dd5fc435" containerName="mariadb-database-create" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.472459 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="017b2353-9d72-4844-a21b-9dce833ae063" 
containerName="collect-profiles" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.474881 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.492521 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dd98f6d57-fztws"] Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.511315 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-ovsdbserver-nb\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.511439 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-config\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.511518 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-dns-svc\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.511546 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-ovsdbserver-sb\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 
crc kubenswrapper[4956]: I0930 05:45:10.511573 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcw85\" (UniqueName: \"kubernetes.io/projected/bca37042-172f-42db-ac83-85a5872720df-kube-api-access-fcw85\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.511608 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-dns-swift-storage-0\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.615131 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-config\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.615227 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-dns-svc\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.615248 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-ovsdbserver-sb\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc 
kubenswrapper[4956]: I0930 05:45:10.615264 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcw85\" (UniqueName: \"kubernetes.io/projected/bca37042-172f-42db-ac83-85a5872720df-kube-api-access-fcw85\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.615292 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-dns-swift-storage-0\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.615326 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-ovsdbserver-nb\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.616332 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-ovsdbserver-nb\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.616646 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-config\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.616686 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-dns-svc\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.621245 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-dns-swift-storage-0\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.621504 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-ovsdbserver-sb\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.635628 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcw85\" (UniqueName: \"kubernetes.io/projected/bca37042-172f-42db-ac83-85a5872720df-kube-api-access-fcw85\") pod \"dnsmasq-dns-5dd98f6d57-fztws\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.798442 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.920544 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-588e-account-create-9zfn6"] Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.922254 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-588e-account-create-9zfn6" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.924077 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 30 05:45:10 crc kubenswrapper[4956]: I0930 05:45:10.928754 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-588e-account-create-9zfn6"] Sep 30 05:45:11 crc kubenswrapper[4956]: I0930 05:45:11.021210 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlmv5\" (UniqueName: \"kubernetes.io/projected/5cfef647-732a-43ff-ad7d-59b6f1eb1f2a-kube-api-access-vlmv5\") pod \"barbican-588e-account-create-9zfn6\" (UID: \"5cfef647-732a-43ff-ad7d-59b6f1eb1f2a\") " pod="openstack/barbican-588e-account-create-9zfn6" Sep 30 05:45:11 crc kubenswrapper[4956]: I0930 05:45:11.123342 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlmv5\" (UniqueName: \"kubernetes.io/projected/5cfef647-732a-43ff-ad7d-59b6f1eb1f2a-kube-api-access-vlmv5\") pod \"barbican-588e-account-create-9zfn6\" (UID: \"5cfef647-732a-43ff-ad7d-59b6f1eb1f2a\") " pod="openstack/barbican-588e-account-create-9zfn6" Sep 30 05:45:11 crc kubenswrapper[4956]: I0930 05:45:11.139222 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlmv5\" (UniqueName: \"kubernetes.io/projected/5cfef647-732a-43ff-ad7d-59b6f1eb1f2a-kube-api-access-vlmv5\") pod \"barbican-588e-account-create-9zfn6\" (UID: \"5cfef647-732a-43ff-ad7d-59b6f1eb1f2a\") " pod="openstack/barbican-588e-account-create-9zfn6" Sep 30 05:45:11 crc kubenswrapper[4956]: I0930 05:45:11.256172 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-588e-account-create-9zfn6" Sep 30 05:45:14 crc kubenswrapper[4956]: I0930 05:45:14.500354 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dd98f6d57-fztws"] Sep 30 05:45:14 crc kubenswrapper[4956]: W0930 05:45:14.512438 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbca37042_172f_42db_ac83_85a5872720df.slice/crio-dca8055b0fc6b8abf1e85d3cf6ec4a70f9d3fa98d011b21537798fff8048adeb WatchSource:0}: Error finding container dca8055b0fc6b8abf1e85d3cf6ec4a70f9d3fa98d011b21537798fff8048adeb: Status 404 returned error can't find the container with id dca8055b0fc6b8abf1e85d3cf6ec4a70f9d3fa98d011b21537798fff8048adeb Sep 30 05:45:14 crc kubenswrapper[4956]: I0930 05:45:14.571649 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-588e-account-create-9zfn6"] Sep 30 05:45:14 crc kubenswrapper[4956]: W0930 05:45:14.578585 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cfef647_732a_43ff_ad7d_59b6f1eb1f2a.slice/crio-1aeb4a5c70ff8ff824f411af60ce2cd35133677a39e55de21669a3f184b3a2f7 WatchSource:0}: Error finding container 1aeb4a5c70ff8ff824f411af60ce2cd35133677a39e55de21669a3f184b3a2f7: Status 404 returned error can't find the container with id 1aeb4a5c70ff8ff824f411af60ce2cd35133677a39e55de21669a3f184b3a2f7 Sep 30 05:45:14 crc kubenswrapper[4956]: E0930 05:45:14.598356 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod826ffd10_134e_452b_93de_3a9bc469d44d.slice/crio-18af1d264db565f9694f2705f2a0d5db22f030f2f86aff76c63caaa9f8edecde.scope\": RecentStats: unable to find data in memory cache]" Sep 30 05:45:14 crc kubenswrapper[4956]: I0930 05:45:14.660528 4956 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-24d8t" event={"ID":"de9dd57e-7978-4d30-b9ec-a3894e9d67e9","Type":"ContainerStarted","Data":"02008c1544ef92e2caf961213b92835f8fd219bfbdf2cceed525a7d78c6ac0cc"} Sep 30 05:45:14 crc kubenswrapper[4956]: I0930 05:45:14.662632 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" event={"ID":"bca37042-172f-42db-ac83-85a5872720df","Type":"ContainerStarted","Data":"dca8055b0fc6b8abf1e85d3cf6ec4a70f9d3fa98d011b21537798fff8048adeb"} Sep 30 05:45:14 crc kubenswrapper[4956]: I0930 05:45:14.665194 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-588e-account-create-9zfn6" event={"ID":"5cfef647-732a-43ff-ad7d-59b6f1eb1f2a","Type":"ContainerStarted","Data":"1aeb4a5c70ff8ff824f411af60ce2cd35133677a39e55de21669a3f184b3a2f7"} Sep 30 05:45:14 crc kubenswrapper[4956]: I0930 05:45:14.666724 4956 generic.go:334] "Generic (PLEG): container finished" podID="826ffd10-134e-452b-93de-3a9bc469d44d" containerID="18af1d264db565f9694f2705f2a0d5db22f030f2f86aff76c63caaa9f8edecde" exitCode=0 Sep 30 05:45:14 crc kubenswrapper[4956]: I0930 05:45:14.666756 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tkh7q" event={"ID":"826ffd10-134e-452b-93de-3a9bc469d44d","Type":"ContainerDied","Data":"18af1d264db565f9694f2705f2a0d5db22f030f2f86aff76c63caaa9f8edecde"} Sep 30 05:45:14 crc kubenswrapper[4956]: I0930 05:45:14.697195 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-24d8t" podStartSLOduration=2.229672555 podStartE2EDuration="11.697178537s" podCreationTimestamp="2025-09-30 05:45:03 +0000 UTC" firstStartedPulling="2025-09-30 05:45:04.585812873 +0000 UTC m=+974.912933398" lastFinishedPulling="2025-09-30 05:45:14.053318855 +0000 UTC m=+984.380439380" observedRunningTime="2025-09-30 05:45:14.678959245 +0000 UTC m=+985.006079770" watchObservedRunningTime="2025-09-30 
05:45:14.697178537 +0000 UTC m=+985.024299072" Sep 30 05:45:15 crc kubenswrapper[4956]: I0930 05:45:15.676217 4956 generic.go:334] "Generic (PLEG): container finished" podID="bca37042-172f-42db-ac83-85a5872720df" containerID="980217e4def293df458c7de97af548b7ee6931108d8e1921ec36bcdd033aa2c5" exitCode=0 Sep 30 05:45:15 crc kubenswrapper[4956]: I0930 05:45:15.676326 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" event={"ID":"bca37042-172f-42db-ac83-85a5872720df","Type":"ContainerDied","Data":"980217e4def293df458c7de97af548b7ee6931108d8e1921ec36bcdd033aa2c5"} Sep 30 05:45:15 crc kubenswrapper[4956]: I0930 05:45:15.679842 4956 generic.go:334] "Generic (PLEG): container finished" podID="5cfef647-732a-43ff-ad7d-59b6f1eb1f2a" containerID="50fdd61af367195757809d530dd915594a5c82ae555ee93f1467646b253bcd38" exitCode=0 Sep 30 05:45:15 crc kubenswrapper[4956]: I0930 05:45:15.679895 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-588e-account-create-9zfn6" event={"ID":"5cfef647-732a-43ff-ad7d-59b6f1eb1f2a","Type":"ContainerDied","Data":"50fdd61af367195757809d530dd915594a5c82ae555ee93f1467646b253bcd38"} Sep 30 05:45:15 crc kubenswrapper[4956]: I0930 05:45:15.999125 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.131211 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826ffd10-134e-452b-93de-3a9bc469d44d-config-data\") pod \"826ffd10-134e-452b-93de-3a9bc469d44d\" (UID: \"826ffd10-134e-452b-93de-3a9bc469d44d\") " Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.131385 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826ffd10-134e-452b-93de-3a9bc469d44d-combined-ca-bundle\") pod \"826ffd10-134e-452b-93de-3a9bc469d44d\" (UID: \"826ffd10-134e-452b-93de-3a9bc469d44d\") " Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.131468 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqsz8\" (UniqueName: \"kubernetes.io/projected/826ffd10-134e-452b-93de-3a9bc469d44d-kube-api-access-qqsz8\") pod \"826ffd10-134e-452b-93de-3a9bc469d44d\" (UID: \"826ffd10-134e-452b-93de-3a9bc469d44d\") " Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.136345 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/826ffd10-134e-452b-93de-3a9bc469d44d-kube-api-access-qqsz8" (OuterVolumeSpecName: "kube-api-access-qqsz8") pod "826ffd10-134e-452b-93de-3a9bc469d44d" (UID: "826ffd10-134e-452b-93de-3a9bc469d44d"). InnerVolumeSpecName "kube-api-access-qqsz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.158835 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/826ffd10-134e-452b-93de-3a9bc469d44d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "826ffd10-134e-452b-93de-3a9bc469d44d" (UID: "826ffd10-134e-452b-93de-3a9bc469d44d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.198023 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/826ffd10-134e-452b-93de-3a9bc469d44d-config-data" (OuterVolumeSpecName: "config-data") pod "826ffd10-134e-452b-93de-3a9bc469d44d" (UID: "826ffd10-134e-452b-93de-3a9bc469d44d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.232926 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/826ffd10-134e-452b-93de-3a9bc469d44d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.232965 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/826ffd10-134e-452b-93de-3a9bc469d44d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.232977 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqsz8\" (UniqueName: \"kubernetes.io/projected/826ffd10-134e-452b-93de-3a9bc469d44d-kube-api-access-qqsz8\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.690424 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tkh7q" event={"ID":"826ffd10-134e-452b-93de-3a9bc469d44d","Type":"ContainerDied","Data":"d9d100843c02f8f5cb33980511442549c074d44b242bbd8619bbe7e6e82424a4"} Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.690472 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9d100843c02f8f5cb33980511442549c074d44b242bbd8619bbe7e6e82424a4" Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.690488 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-tkh7q" Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.693488 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" event={"ID":"bca37042-172f-42db-ac83-85a5872720df","Type":"ContainerStarted","Data":"60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21"} Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.715251 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" podStartSLOduration=6.715235565 podStartE2EDuration="6.715235565s" podCreationTimestamp="2025-09-30 05:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:16.713288633 +0000 UTC m=+987.040409178" watchObservedRunningTime="2025-09-30 05:45:16.715235565 +0000 UTC m=+987.042356090" Sep 30 05:45:16 crc kubenswrapper[4956]: I0930 05:45:16.971327 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dd98f6d57-fztws"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.031183 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ns6tm"] Sep 30 05:45:17 crc kubenswrapper[4956]: E0930 05:45:17.031700 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="826ffd10-134e-452b-93de-3a9bc469d44d" containerName="keystone-db-sync" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.031718 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="826ffd10-134e-452b-93de-3a9bc469d44d" containerName="keystone-db-sync" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.031946 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="826ffd10-134e-452b-93de-3a9bc469d44d" containerName="keystone-db-sync" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.032750 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.042874 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.043046 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.043156 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n4lp7" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.043580 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ns6tm"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.048472 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.081855 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55d459d457-25v62"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.083303 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.113272 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d459d457-25v62"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.146983 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw8hf\" (UniqueName: \"kubernetes.io/projected/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-kube-api-access-rw8hf\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.147052 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-config-data\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.147074 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-scripts\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.147094 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-fernet-keys\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.147193 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-combined-ca-bundle\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.147212 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-credential-keys\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.191211 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6cf65b7d5c-99r5q"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.192617 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.200148 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.200379 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.200506 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-d4jgj" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.200609 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.215959 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cf65b7d5c-99r5q"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.252300 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-config\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.252353 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-combined-ca-bundle\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.252376 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-credential-keys\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.252409 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-ovsdbserver-sb\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.252468 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw8hf\" (UniqueName: \"kubernetes.io/projected/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-kube-api-access-rw8hf\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.252486 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-dns-swift-storage-0\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.252522 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-config-data\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.252538 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-scripts\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.252560 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgpzx\" (UniqueName: \"kubernetes.io/projected/1f59c98b-9a7e-4187-b60c-60761092be95-kube-api-access-bgpzx\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.252578 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-fernet-keys\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.252604 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-dns-svc\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.252628 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-ovsdbserver-nb\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.270683 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-scripts\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.286346 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-credential-keys\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.286799 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-fernet-keys\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.287843 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-config-data\") pod \"keystone-bootstrap-ns6tm\" (UID: 
\"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.290461 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-combined-ca-bundle\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.309771 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw8hf\" (UniqueName: \"kubernetes.io/projected/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-kube-api-access-rw8hf\") pod \"keystone-bootstrap-ns6tm\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.345630 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.347901 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.354272 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-config\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.354336 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2e4921bb-4cc7-477b-b932-848d0c4d2a09-horizon-secret-key\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.354369 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtc26\" (UniqueName: \"kubernetes.io/projected/2e4921bb-4cc7-477b-b932-848d0c4d2a09-kube-api-access-rtc26\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.354433 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-ovsdbserver-sb\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.354478 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e4921bb-4cc7-477b-b932-848d0c4d2a09-config-data\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " 
pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.354530 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-dns-swift-storage-0\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.354572 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e4921bb-4cc7-477b-b932-848d0c4d2a09-scripts\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.354606 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgpzx\" (UniqueName: \"kubernetes.io/projected/1f59c98b-9a7e-4187-b60c-60761092be95-kube-api-access-bgpzx\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.354644 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-dns-svc\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.354679 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-ovsdbserver-nb\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 
05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.354702 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e4921bb-4cc7-477b-b932-848d0c4d2a09-logs\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.355700 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-config\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.356380 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-ovsdbserver-sb\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.356971 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-dns-swift-storage-0\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.358181 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-dns-svc\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.358872 4956 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-ovsdbserver-nb\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.361697 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.361994 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.362881 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.387277 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d459d457-25v62"] Sep 30 05:45:17 crc kubenswrapper[4956]: E0930 05:45:17.388091 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-bgpzx], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-55d459d457-25v62" podUID="1f59c98b-9a7e-4187-b60c-60761092be95" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.409824 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgpzx\" (UniqueName: \"kubernetes.io/projected/1f59c98b-9a7e-4187-b60c-60761092be95-kube-api-access-bgpzx\") pod \"dnsmasq-dns-55d459d457-25v62\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.420377 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.446940 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-dqs9m"] Sep 30 05:45:17 crc 
kubenswrapper[4956]: I0930 05:45:17.460978 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f89721a-6d95-46fd-9a8f-d701ccda87b8-log-httpd\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.461274 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e4921bb-4cc7-477b-b932-848d0c4d2a09-logs\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.461372 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjjmh\" (UniqueName: \"kubernetes.io/projected/5f89721a-6d95-46fd-9a8f-d701ccda87b8-kube-api-access-bjjmh\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.461484 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2e4921bb-4cc7-477b-b932-848d0c4d2a09-horizon-secret-key\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.461550 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtc26\" (UniqueName: \"kubernetes.io/projected/2e4921bb-4cc7-477b-b932-848d0c4d2a09-kube-api-access-rtc26\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.461641 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.461736 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-scripts\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.461797 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.461877 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e4921bb-4cc7-477b-b932-848d0c4d2a09-config-data\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.461979 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-config-data\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.462094 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2e4921bb-4cc7-477b-b932-848d0c4d2a09-scripts\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.462208 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f89721a-6d95-46fd-9a8f-d701ccda87b8-run-httpd\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.463083 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.462013 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e4921bb-4cc7-477b-b932-848d0c4d2a09-logs\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.463783 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e4921bb-4cc7-477b-b932-848d0c4d2a09-config-data\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.464319 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e4921bb-4cc7-477b-b932-848d0c4d2a09-scripts\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.470272 4956 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"placement-placement-dockercfg-pp5qx" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.470840 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.472507 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2e4921bb-4cc7-477b-b932-848d0c4d2a09-horizon-secret-key\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.475640 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.508097 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtc26\" (UniqueName: \"kubernetes.io/projected/2e4921bb-4cc7-477b-b932-848d0c4d2a09-kube-api-access-rtc26\") pod \"horizon-6cf65b7d5c-99r5q\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.508347 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dqs9m"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.516741 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.562585 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.564435 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.568059 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-config-data\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.568103 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-scripts\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.568201 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f89721a-6d95-46fd-9a8f-d701ccda87b8-run-httpd\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.568236 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f89721a-6d95-46fd-9a8f-d701ccda87b8-log-httpd\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.568261 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-logs\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.568296 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-config-data\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.568317 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlvpm\" (UniqueName: \"kubernetes.io/projected/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-kube-api-access-rlvpm\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.568359 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-combined-ca-bundle\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.568378 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjjmh\" (UniqueName: \"kubernetes.io/projected/5f89721a-6d95-46fd-9a8f-d701ccda87b8-kube-api-access-bjjmh\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.568420 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.568450 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-scripts\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.568464 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.570932 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mwhll" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.571238 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.571375 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.571541 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.584094 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cf77b4997-vmw7n"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.585522 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.610788 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-588e-account-create-9zfn6" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.613764 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f89721a-6d95-46fd-9a8f-d701ccda87b8-run-httpd\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.619498 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.623840 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.628906 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f89721a-6d95-46fd-9a8f-d701ccda87b8-log-httpd\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.629104 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-config-data\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.630075 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjjmh\" (UniqueName: 
\"kubernetes.io/projected/5f89721a-6d95-46fd-9a8f-d701ccda87b8-kube-api-access-bjjmh\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.643817 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-scripts\") pod \"ceilometer-0\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") " pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.672602 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlmv5\" (UniqueName: \"kubernetes.io/projected/5cfef647-732a-43ff-ad7d-59b6f1eb1f2a-kube-api-access-vlmv5\") pod \"5cfef647-732a-43ff-ad7d-59b6f1eb1f2a\" (UID: \"5cfef647-732a-43ff-ad7d-59b6f1eb1f2a\") " Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.672904 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-dns-svc\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.672939 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-logs\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.672967 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-config-data\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " 
pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.672989 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcjv4\" (UniqueName: \"kubernetes.io/projected/e51333ef-f104-4d0d-a00b-76a2800a425a-kube-api-access-hcjv4\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673018 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlvpm\" (UniqueName: \"kubernetes.io/projected/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-kube-api-access-rlvpm\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673063 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-combined-ca-bundle\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673103 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-ovsdbserver-nb\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673148 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e51333ef-f104-4d0d-a00b-76a2800a425a-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673175 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673218 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-config-data\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673244 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j4bt\" (UniqueName: \"kubernetes.io/projected/b0f69bb4-0f8e-489f-9a47-06e84127473e-kube-api-access-4j4bt\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673321 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673346 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-scripts\") pod \"placement-db-sync-dqs9m\" (UID: 
\"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673364 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-scripts\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673386 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-ovsdbserver-sb\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673409 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-config\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673429 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-dns-swift-storage-0\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673454 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673485 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e51333ef-f104-4d0d-a00b-76a2800a425a-logs\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.673892 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-logs\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.677797 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cfef647-732a-43ff-ad7d-59b6f1eb1f2a-kube-api-access-vlmv5" (OuterVolumeSpecName: "kube-api-access-vlmv5") pod "5cfef647-732a-43ff-ad7d-59b6f1eb1f2a" (UID: "5cfef647-732a-43ff-ad7d-59b6f1eb1f2a"). InnerVolumeSpecName "kube-api-access-vlmv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.679237 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-config-data\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.680489 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-scripts\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.686607 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-combined-ca-bundle\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.690933 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-dfd95655c-tvpq8"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.691336 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlvpm\" (UniqueName: \"kubernetes.io/projected/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-kube-api-access-rlvpm\") pod \"placement-db-sync-dqs9m\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: E0930 05:45:17.691415 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfef647-732a-43ff-ad7d-59b6f1eb1f2a" containerName="mariadb-account-create" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.691894 4956 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5cfef647-732a-43ff-ad7d-59b6f1eb1f2a" containerName="mariadb-account-create" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.692203 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cfef647-732a-43ff-ad7d-59b6f1eb1f2a" containerName="mariadb-account-create" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.693727 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.712625 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.726634 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cf77b4997-vmw7n"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.758256 4956 generic.go:334] "Generic (PLEG): container finished" podID="de9dd57e-7978-4d30-b9ec-a3894e9d67e9" containerID="02008c1544ef92e2caf961213b92835f8fd219bfbdf2cceed525a7d78c6ac0cc" exitCode=0 Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.758393 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-24d8t" event={"ID":"de9dd57e-7978-4d30-b9ec-a3894e9d67e9","Type":"ContainerDied","Data":"02008c1544ef92e2caf961213b92835f8fd219bfbdf2cceed525a7d78c6ac0cc"} Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.764479 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dfd95655c-tvpq8"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.766194 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.766976 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-588e-account-create-9zfn6" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.767202 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-588e-account-create-9zfn6" event={"ID":"5cfef647-732a-43ff-ad7d-59b6f1eb1f2a","Type":"ContainerDied","Data":"1aeb4a5c70ff8ff824f411af60ce2cd35133677a39e55de21669a3f184b3a2f7"} Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.767253 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aeb4a5c70ff8ff824f411af60ce2cd35133677a39e55de21669a3f184b3a2f7" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.779987 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.809643 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.815212 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b25f900c-e486-4143-8ef9-0416286dc2dc-logs\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.815323 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-config\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.815349 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-dns-swift-storage-0\") pod 
\"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.815423 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.815524 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e51333ef-f104-4d0d-a00b-76a2800a425a-logs\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.815551 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd9fj\" (UniqueName: \"kubernetes.io/projected/b25f900c-e486-4143-8ef9-0416286dc2dc-kube-api-access-hd9fj\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.817585 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-dns-svc\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.817638 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b25f900c-e486-4143-8ef9-0416286dc2dc-horizon-secret-key\") pod \"horizon-dfd95655c-tvpq8\" (UID: 
\"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.817714 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcjv4\" (UniqueName: \"kubernetes.io/projected/e51333ef-f104-4d0d-a00b-76a2800a425a-kube-api-access-hcjv4\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.817945 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-ovsdbserver-nb\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.818005 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e51333ef-f104-4d0d-a00b-76a2800a425a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.818038 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.818165 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-config-data\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " 
pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.818249 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j4bt\" (UniqueName: \"kubernetes.io/projected/b0f69bb4-0f8e-489f-9a47-06e84127473e-kube-api-access-4j4bt\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.818414 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.818547 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b25f900c-e486-4143-8ef9-0416286dc2dc-scripts\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.818889 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-scripts\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.818943 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-ovsdbserver-sb\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 
05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.818971 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b25f900c-e486-4143-8ef9-0416286dc2dc-config-data\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.828059 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-dns-swift-storage-0\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.828388 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.829489 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.835933 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e51333ef-f104-4d0d-a00b-76a2800a425a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.836020 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e51333ef-f104-4d0d-a00b-76a2800a425a-logs\") pod \"glance-default-external-api-0\" (UID: 
\"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.836240 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-ovsdbserver-nb\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.838014 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-dns-svc\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.838663 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-ovsdbserver-sb\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.838811 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.838826 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-config\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc 
kubenswrapper[4956]: I0930 05:45:17.838336 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlmv5\" (UniqueName: \"kubernetes.io/projected/5cfef647-732a-43ff-ad7d-59b6f1eb1f2a-kube-api-access-vlmv5\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.867735 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.870605 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-scripts\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.871316 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-config-data\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.872461 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.874545 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j4bt\" (UniqueName: \"kubernetes.io/projected/b0f69bb4-0f8e-489f-9a47-06e84127473e-kube-api-access-4j4bt\") pod \"dnsmasq-dns-7cf77b4997-vmw7n\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.875671 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcjv4\" (UniqueName: \"kubernetes.io/projected/e51333ef-f104-4d0d-a00b-76a2800a425a-kube-api-access-hcjv4\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.886005 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.886206 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.905102 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.933742 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.943163 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-dns-svc\") pod \"1f59c98b-9a7e-4187-b60c-60761092be95\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.943293 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-config\") pod \"1f59c98b-9a7e-4187-b60c-60761092be95\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.943427 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-dns-swift-storage-0\") pod \"1f59c98b-9a7e-4187-b60c-60761092be95\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.943473 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-ovsdbserver-sb\") pod \"1f59c98b-9a7e-4187-b60c-60761092be95\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.943508 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgpzx\" (UniqueName: \"kubernetes.io/projected/1f59c98b-9a7e-4187-b60c-60761092be95-kube-api-access-bgpzx\") pod \"1f59c98b-9a7e-4187-b60c-60761092be95\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.943585 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 
30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.943720 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-ovsdbserver-nb\") pod \"1f59c98b-9a7e-4187-b60c-60761092be95\" (UID: \"1f59c98b-9a7e-4187-b60c-60761092be95\") " Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.944523 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b25f900c-e486-4143-8ef9-0416286dc2dc-scripts\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.944576 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b25f900c-e486-4143-8ef9-0416286dc2dc-config-data\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.944596 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b25f900c-e486-4143-8ef9-0416286dc2dc-logs\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.944696 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd9fj\" (UniqueName: \"kubernetes.io/projected/b25f900c-e486-4143-8ef9-0416286dc2dc-kube-api-access-hd9fj\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.944744 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b25f900c-e486-4143-8ef9-0416286dc2dc-horizon-secret-key\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.944911 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-config" (OuterVolumeSpecName: "config") pod "1f59c98b-9a7e-4187-b60c-60761092be95" (UID: "1f59c98b-9a7e-4187-b60c-60761092be95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.945821 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1f59c98b-9a7e-4187-b60c-60761092be95" (UID: "1f59c98b-9a7e-4187-b60c-60761092be95"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.946449 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f59c98b-9a7e-4187-b60c-60761092be95" (UID: "1f59c98b-9a7e-4187-b60c-60761092be95"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.948342 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f59c98b-9a7e-4187-b60c-60761092be95" (UID: "1f59c98b-9a7e-4187-b60c-60761092be95"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.949243 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b25f900c-e486-4143-8ef9-0416286dc2dc-scripts\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.949643 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f59c98b-9a7e-4187-b60c-60761092be95" (UID: "1f59c98b-9a7e-4187-b60c-60761092be95"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.950037 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b25f900c-e486-4143-8ef9-0416286dc2dc-logs\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.951206 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b25f900c-e486-4143-8ef9-0416286dc2dc-config-data\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.960309 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.977835 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd9fj\" (UniqueName: \"kubernetes.io/projected/b25f900c-e486-4143-8ef9-0416286dc2dc-kube-api-access-hd9fj\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.979016 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.985251 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b25f900c-e486-4143-8ef9-0416286dc2dc-horizon-secret-key\") pod \"horizon-dfd95655c-tvpq8\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:17 crc kubenswrapper[4956]: I0930 05:45:17.985368 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f59c98b-9a7e-4187-b60c-60761092be95-kube-api-access-bgpzx" (OuterVolumeSpecName: "kube-api-access-bgpzx") pod "1f59c98b-9a7e-4187-b60c-60761092be95" (UID: "1f59c98b-9a7e-4187-b60c-60761092be95"). InnerVolumeSpecName "kube-api-access-bgpzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.024199 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.048312 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e014251-976b-4acd-ac9c-30b14ed29c6c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.048358 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.048382 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.048416 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e014251-976b-4acd-ac9c-30b14ed29c6c-logs\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.048504 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whfrp\" (UniqueName: \"kubernetes.io/projected/7e014251-976b-4acd-ac9c-30b14ed29c6c-kube-api-access-whfrp\") pod \"glance-default-internal-api-0\" (UID: 
\"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.048636 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.048662 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.048688 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.049536 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.049554 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.049566 4956 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.049578 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.049589 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgpzx\" (UniqueName: \"kubernetes.io/projected/1f59c98b-9a7e-4187-b60c-60761092be95-kube-api-access-bgpzx\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.049599 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f59c98b-9a7e-4187-b60c-60761092be95-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.125739 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ns6tm"] Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.151287 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e014251-976b-4acd-ac9c-30b14ed29c6c-logs\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.151707 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e014251-976b-4acd-ac9c-30b14ed29c6c-logs\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.152062 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-whfrp\" (UniqueName: \"kubernetes.io/projected/7e014251-976b-4acd-ac9c-30b14ed29c6c-kube-api-access-whfrp\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.152103 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.152721 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.152754 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.152853 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e014251-976b-4acd-ac9c-30b14ed29c6c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.152876 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.152899 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.153293 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.154637 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e014251-976b-4acd-ac9c-30b14ed29c6c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.158779 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.162600 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.164398 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.171216 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: W0930 05:45:18.173358 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e7c7fc5_5510_4b6f_81e2_96f428be0a0b.slice/crio-d7880981caec4824461e7ef955b79d11f5260d9241f6db08fae6f4cfb42f8915 WatchSource:0}: Error finding container d7880981caec4824461e7ef955b79d11f5260d9241f6db08fae6f4cfb42f8915: Status 404 returned error can't find the container with id d7880981caec4824461e7ef955b79d11f5260d9241f6db08fae6f4cfb42f8915 Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.198524 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whfrp\" (UniqueName: \"kubernetes.io/projected/7e014251-976b-4acd-ac9c-30b14ed29c6c-kube-api-access-whfrp\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.210704 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.234726 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.257072 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.271049 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cf65b7d5c-99r5q"] Sep 30 05:45:18 crc kubenswrapper[4956]: W0930 05:45:18.367488 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e4921bb_4cc7_477b_b932_848d0c4d2a09.slice/crio-dbf497f0d4957cb23dbf081d00f51b2c023c85616f592c5c9b90218b062bcc71 WatchSource:0}: Error finding container dbf497f0d4957cb23dbf081d00f51b2c023c85616f592c5c9b90218b062bcc71: Status 404 returned error can't find the container with id dbf497f0d4957cb23dbf081d00f51b2c023c85616f592c5c9b90218b062bcc71 Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.777793 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cf77b4997-vmw7n"] Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.827630 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cf65b7d5c-99r5q" event={"ID":"2e4921bb-4cc7-477b-b932-848d0c4d2a09","Type":"ContainerStarted","Data":"dbf497f0d4957cb23dbf081d00f51b2c023c85616f592c5c9b90218b062bcc71"} Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.844470 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" podUID="bca37042-172f-42db-ac83-85a5872720df" 
containerName="dnsmasq-dns" containerID="cri-o://60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21" gracePeriod=10 Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.844761 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ns6tm" event={"ID":"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b","Type":"ContainerStarted","Data":"d7880981caec4824461e7ef955b79d11f5260d9241f6db08fae6f4cfb42f8915"} Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.844866 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d459d457-25v62" Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.852187 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.883635 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dfd95655c-tvpq8"] Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.907793 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dqs9m"] Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.965176 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d459d457-25v62"] Sep 30 05:45:18 crc kubenswrapper[4956]: I0930 05:45:18.974388 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55d459d457-25v62"] Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.333680 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:45:19 crc kubenswrapper[4956]: W0930 05:45:19.394438 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode51333ef_f104_4d0d_a00b_76a2800a425a.slice/crio-10361506f80e036f5020c0e8ca9ddf35792b448c5f6cad45a61b5f9a07988f7f WatchSource:0}: Error finding container 
10361506f80e036f5020c0e8ca9ddf35792b448c5f6cad45a61b5f9a07988f7f: Status 404 returned error can't find the container with id 10361506f80e036f5020c0e8ca9ddf35792b448c5f6cad45a61b5f9a07988f7f Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.545478 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.627621 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.722092 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-ovsdbserver-nb\") pod \"bca37042-172f-42db-ac83-85a5872720df\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.722508 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-config\") pod \"bca37042-172f-42db-ac83-85a5872720df\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.722552 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-db-sync-config-data\") pod \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.722589 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-ovsdbserver-sb\") pod \"bca37042-172f-42db-ac83-85a5872720df\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " Sep 30 05:45:19 crc 
kubenswrapper[4956]: I0930 05:45:19.722633 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg9w7\" (UniqueName: \"kubernetes.io/projected/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-kube-api-access-jg9w7\") pod \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.722708 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcw85\" (UniqueName: \"kubernetes.io/projected/bca37042-172f-42db-ac83-85a5872720df-kube-api-access-fcw85\") pod \"bca37042-172f-42db-ac83-85a5872720df\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.722792 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-combined-ca-bundle\") pod \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.722850 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-dns-svc\") pod \"bca37042-172f-42db-ac83-85a5872720df\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.722869 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-dns-swift-storage-0\") pod \"bca37042-172f-42db-ac83-85a5872720df\" (UID: \"bca37042-172f-42db-ac83-85a5872720df\") " Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.722924 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-config-data\") pod \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\" (UID: \"de9dd57e-7978-4d30-b9ec-a3894e9d67e9\") " Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.745744 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca37042-172f-42db-ac83-85a5872720df-kube-api-access-fcw85" (OuterVolumeSpecName: "kube-api-access-fcw85") pod "bca37042-172f-42db-ac83-85a5872720df" (UID: "bca37042-172f-42db-ac83-85a5872720df"). InnerVolumeSpecName "kube-api-access-fcw85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.752376 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-kube-api-access-jg9w7" (OuterVolumeSpecName: "kube-api-access-jg9w7") pod "de9dd57e-7978-4d30-b9ec-a3894e9d67e9" (UID: "de9dd57e-7978-4d30-b9ec-a3894e9d67e9"). InnerVolumeSpecName "kube-api-access-jg9w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.775547 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.818510 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "de9dd57e-7978-4d30-b9ec-a3894e9d67e9" (UID: "de9dd57e-7978-4d30-b9ec-a3894e9d67e9"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.825497 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.825529 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg9w7\" (UniqueName: \"kubernetes.io/projected/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-kube-api-access-jg9w7\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.825539 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcw85\" (UniqueName: \"kubernetes.io/projected/bca37042-172f-42db-ac83-85a5872720df-kube-api-access-fcw85\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.835234 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cf65b7d5c-99r5q"] Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.907868 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de9dd57e-7978-4d30-b9ec-a3894e9d67e9" (UID: "de9dd57e-7978-4d30-b9ec-a3894e9d67e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.908107 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dqs9m" event={"ID":"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5","Type":"ContainerStarted","Data":"46a63576fa1a218f0cbfeac73deded86a7623ef743500774d76c82a057d7c33b"} Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.908698 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7dcbfc4c57-z8gm9"] Sep 30 05:45:19 crc kubenswrapper[4956]: E0930 05:45:19.909273 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de9dd57e-7978-4d30-b9ec-a3894e9d67e9" containerName="watcher-db-sync" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.909291 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="de9dd57e-7978-4d30-b9ec-a3894e9d67e9" containerName="watcher-db-sync" Sep 30 05:45:19 crc kubenswrapper[4956]: E0930 05:45:19.909304 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca37042-172f-42db-ac83-85a5872720df" containerName="dnsmasq-dns" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.909311 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca37042-172f-42db-ac83-85a5872720df" containerName="dnsmasq-dns" Sep 30 05:45:19 crc kubenswrapper[4956]: E0930 05:45:19.909347 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca37042-172f-42db-ac83-85a5872720df" containerName="init" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.909353 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca37042-172f-42db-ac83-85a5872720df" containerName="init" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.909575 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="de9dd57e-7978-4d30-b9ec-a3894e9d67e9" containerName="watcher-db-sync" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.909596 4956 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="bca37042-172f-42db-ac83-85a5872720df" containerName="dnsmasq-dns" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.915594 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.923181 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bca37042-172f-42db-ac83-85a5872720df" (UID: "bca37042-172f-42db-ac83-85a5872720df"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.929321 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.929357 4956 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.929392 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.945048 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-24d8t" event={"ID":"de9dd57e-7978-4d30-b9ec-a3894e9d67e9","Type":"ContainerDied","Data":"0c39f4268725b531d4869c1a7d517bb6a2ab8d70f4285234770820df6e66e349"} Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.945105 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c39f4268725b531d4869c1a7d517bb6a2ab8d70f4285234770820df6e66e349" Sep 30 05:45:19 crc 
kubenswrapper[4956]: I0930 05:45:19.945229 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-24d8t" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.949603 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7dcbfc4c57-z8gm9"] Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.962575 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.962568 4956 generic.go:334] "Generic (PLEG): container finished" podID="bca37042-172f-42db-ac83-85a5872720df" containerID="60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21" exitCode=0 Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.962645 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" event={"ID":"bca37042-172f-42db-ac83-85a5872720df","Type":"ContainerDied","Data":"60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21"} Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.962870 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" event={"ID":"bca37042-172f-42db-ac83-85a5872720df","Type":"ContainerDied","Data":"dca8055b0fc6b8abf1e85d3cf6ec4a70f9d3fa98d011b21537798fff8048adeb"} Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.962910 4956 scope.go:117] "RemoveContainer" containerID="60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21" Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.971780 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.979977 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e51333ef-f104-4d0d-a00b-76a2800a425a","Type":"ContainerStarted","Data":"10361506f80e036f5020c0e8ca9ddf35792b448c5f6cad45a61b5f9a07988f7f"} Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.981350 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f89721a-6d95-46fd-9a8f-d701ccda87b8","Type":"ContainerStarted","Data":"b14b7ec64210ac7f9e7bca12e2321c4072a34a021029e24f5487421e26d9caad"} Sep 30 05:45:19 crc kubenswrapper[4956]: I0930 05:45:19.987785 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dfd95655c-tvpq8" event={"ID":"b25f900c-e486-4143-8ef9-0416286dc2dc","Type":"ContainerStarted","Data":"549b9489b9bbd1a79bdd9e8278a48c3d1701339a39bc09cc9e90153c53228ea5"} Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.009468 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ns6tm" event={"ID":"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b","Type":"ContainerStarted","Data":"04b84c0b3e87464976e5c29c292c9d76380685b63d170678c4fb2aefa5553328"} Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.023468 4956 scope.go:117] "RemoveContainer" containerID="980217e4def293df458c7de97af548b7ee6931108d8e1921ec36bcdd033aa2c5" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.031705 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a55e72da-6cd5-48ac-b98b-ce39235f96f1-logs\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.031779 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a55e72da-6cd5-48ac-b98b-ce39235f96f1-config-data\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " 
pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.031846 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a55e72da-6cd5-48ac-b98b-ce39235f96f1-horizon-secret-key\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.031871 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgfrx\" (UniqueName: \"kubernetes.io/projected/a55e72da-6cd5-48ac-b98b-ce39235f96f1-kube-api-access-sgfrx\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.031892 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a55e72da-6cd5-48ac-b98b-ce39235f96f1-scripts\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.049127 4956 generic.go:334] "Generic (PLEG): container finished" podID="b0f69bb4-0f8e-489f-9a47-06e84127473e" containerID="df6cc740622ec28a7a6ace160739af8fe6b68f41199c3a33a3a5364a046cf626" exitCode=0 Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.049188 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" event={"ID":"b0f69bb4-0f8e-489f-9a47-06e84127473e","Type":"ContainerDied","Data":"df6cc740622ec28a7a6ace160739af8fe6b68f41199c3a33a3a5364a046cf626"} Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.049227 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" 
event={"ID":"b0f69bb4-0f8e-489f-9a47-06e84127473e","Type":"ContainerStarted","Data":"df186e38457ba1be9277aa57d6493f4a82ede481f5dcf32a015f277caeac8e37"} Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.079747 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ns6tm" podStartSLOduration=4.07967821 podStartE2EDuration="4.07967821s" podCreationTimestamp="2025-09-30 05:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:20.046867609 +0000 UTC m=+990.373988134" watchObservedRunningTime="2025-09-30 05:45:20.07967821 +0000 UTC m=+990.406798725" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.100747 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bca37042-172f-42db-ac83-85a5872720df" (UID: "bca37042-172f-42db-ac83-85a5872720df"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.124846 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-config" (OuterVolumeSpecName: "config") pod "bca37042-172f-42db-ac83-85a5872720df" (UID: "bca37042-172f-42db-ac83-85a5872720df"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.133848 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a55e72da-6cd5-48ac-b98b-ce39235f96f1-logs\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.134050 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a55e72da-6cd5-48ac-b98b-ce39235f96f1-config-data\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.134753 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a55e72da-6cd5-48ac-b98b-ce39235f96f1-horizon-secret-key\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.134846 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgfrx\" (UniqueName: \"kubernetes.io/projected/a55e72da-6cd5-48ac-b98b-ce39235f96f1-kube-api-access-sgfrx\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.134924 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a55e72da-6cd5-48ac-b98b-ce39235f96f1-scripts\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 
05:45:20.135094 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.135187 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.135893 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a55e72da-6cd5-48ac-b98b-ce39235f96f1-scripts\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.137584 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a55e72da-6cd5-48ac-b98b-ce39235f96f1-logs\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.150059 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-config-data" (OuterVolumeSpecName: "config-data") pod "de9dd57e-7978-4d30-b9ec-a3894e9d67e9" (UID: "de9dd57e-7978-4d30-b9ec-a3894e9d67e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.153710 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a55e72da-6cd5-48ac-b98b-ce39235f96f1-config-data\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.159522 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a55e72da-6cd5-48ac-b98b-ce39235f96f1-horizon-secret-key\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.166623 4956 scope.go:117] "RemoveContainer" containerID="60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21" Sep 30 05:45:20 crc kubenswrapper[4956]: E0930 05:45:20.168956 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21\": container with ID starting with 60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21 not found: ID does not exist" containerID="60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.169093 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21"} err="failed to get container status \"60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21\": rpc error: code = NotFound desc = could not find container \"60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21\": container with ID starting with 
60f620fc5edddc115a66e264629285ad8b18244e72a0b49ec6a0f60244001e21 not found: ID does not exist" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.169387 4956 scope.go:117] "RemoveContainer" containerID="980217e4def293df458c7de97af548b7ee6931108d8e1921ec36bcdd033aa2c5" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.169558 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.171548 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bca37042-172f-42db-ac83-85a5872720df" (UID: "bca37042-172f-42db-ac83-85a5872720df"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.181321 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: E0930 05:45:20.184221 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"980217e4def293df458c7de97af548b7ee6931108d8e1921ec36bcdd033aa2c5\": container with ID starting with 980217e4def293df458c7de97af548b7ee6931108d8e1921ec36bcdd033aa2c5 not found: ID does not exist" containerID="980217e4def293df458c7de97af548b7ee6931108d8e1921ec36bcdd033aa2c5" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.184398 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"980217e4def293df458c7de97af548b7ee6931108d8e1921ec36bcdd033aa2c5"} err="failed to get container status \"980217e4def293df458c7de97af548b7ee6931108d8e1921ec36bcdd033aa2c5\": rpc error: code = NotFound desc = could not find container \"980217e4def293df458c7de97af548b7ee6931108d8e1921ec36bcdd033aa2c5\": container 
with ID starting with 980217e4def293df458c7de97af548b7ee6931108d8e1921ec36bcdd033aa2c5 not found: ID does not exist" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.184697 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.192732 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgfrx\" (UniqueName: \"kubernetes.io/projected/a55e72da-6cd5-48ac-b98b-ce39235f96f1-kube-api-access-sgfrx\") pod \"horizon-7dcbfc4c57-z8gm9\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.197670 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.201951 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bca37042-172f-42db-ac83-85a5872720df" (UID: "bca37042-172f-42db-ac83-85a5872720df"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.221292 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.236503 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de9dd57e-7978-4d30-b9ec-a3894e9d67e9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.236530 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.236544 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bca37042-172f-42db-ac83-85a5872720df-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.276449 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.279533 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.281220 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.290013 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.311218 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.332762 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.337297 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.338142 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.344935 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.344985 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.345037 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbl7k\" (UniqueName: \"kubernetes.io/projected/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-kube-api-access-zbl7k\") pod \"watcher-decision-engine-0\" (UID: 
\"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.345087 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-logs\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.345109 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-config-data\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.345762 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.391990 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f59c98b-9a7e-4187-b60c-60761092be95" path="/var/lib/kubelet/pods/1f59c98b-9a7e-4187-b60c-60761092be95/volumes" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.447384 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-config-data\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.447716 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48193dc0-3638-40e3-8b54-2b2049bd5925-config-data\") pod \"watcher-applier-0\" (UID: \"48193dc0-3638-40e3-8b54-2b2049bd5925\") " 
pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.447790 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd5wt\" (UniqueName: \"kubernetes.io/projected/48193dc0-3638-40e3-8b54-2b2049bd5925-kube-api-access-qd5wt\") pod \"watcher-applier-0\" (UID: \"48193dc0-3638-40e3-8b54-2b2049bd5925\") " pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.447871 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.448088 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbx7f\" (UniqueName: \"kubernetes.io/projected/7c50f741-28f6-4bb3-865e-fc21813c1b00-kube-api-access-nbx7f\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.448769 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.448846 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48193dc0-3638-40e3-8b54-2b2049bd5925-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"48193dc0-3638-40e3-8b54-2b2049bd5925\") " pod="openstack/watcher-applier-0" Sep 
30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.448912 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48193dc0-3638-40e3-8b54-2b2049bd5925-logs\") pod \"watcher-applier-0\" (UID: \"48193dc0-3638-40e3-8b54-2b2049bd5925\") " pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.448997 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.449068 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-config-data\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.449272 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.449342 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c50f741-28f6-4bb3-865e-fc21813c1b00-logs\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.449511 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zbl7k\" (UniqueName: \"kubernetes.io/projected/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-kube-api-access-zbl7k\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.449641 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-logs\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.469671 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.473056 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-logs\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.474422 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-config-data\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.476439 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbl7k\" (UniqueName: \"kubernetes.io/projected/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-kube-api-access-zbl7k\") pod 
\"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.496226 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aad83b1a-8ea2-4f43-a6e3-b8e844a65115-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"aad83b1a-8ea2-4f43-a6e3-b8e844a65115\") " pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.527925 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.604130 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-config-data\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.604244 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.604283 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c50f741-28f6-4bb3-865e-fc21813c1b00-logs\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.604500 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48193dc0-3638-40e3-8b54-2b2049bd5925-config-data\") pod 
\"watcher-applier-0\" (UID: \"48193dc0-3638-40e3-8b54-2b2049bd5925\") " pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.604542 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd5wt\" (UniqueName: \"kubernetes.io/projected/48193dc0-3638-40e3-8b54-2b2049bd5925-kube-api-access-qd5wt\") pod \"watcher-applier-0\" (UID: \"48193dc0-3638-40e3-8b54-2b2049bd5925\") " pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.606339 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c50f741-28f6-4bb3-865e-fc21813c1b00-logs\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.609473 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.609524 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbx7f\" (UniqueName: \"kubernetes.io/projected/7c50f741-28f6-4bb3-865e-fc21813c1b00-kube-api-access-nbx7f\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.609597 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48193dc0-3638-40e3-8b54-2b2049bd5925-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"48193dc0-3638-40e3-8b54-2b2049bd5925\") " pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.609621 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48193dc0-3638-40e3-8b54-2b2049bd5925-logs\") pod \"watcher-applier-0\" (UID: \"48193dc0-3638-40e3-8b54-2b2049bd5925\") " pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.610259 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48193dc0-3638-40e3-8b54-2b2049bd5925-logs\") pod \"watcher-applier-0\" (UID: \"48193dc0-3638-40e3-8b54-2b2049bd5925\") " pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.641673 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-config-data\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.641816 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48193dc0-3638-40e3-8b54-2b2049bd5925-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"48193dc0-3638-40e3-8b54-2b2049bd5925\") " pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.642199 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48193dc0-3638-40e3-8b54-2b2049bd5925-config-data\") pod \"watcher-applier-0\" (UID: \"48193dc0-3638-40e3-8b54-2b2049bd5925\") " pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.642662 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " 
pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.645365 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd5wt\" (UniqueName: \"kubernetes.io/projected/48193dc0-3638-40e3-8b54-2b2049bd5925-kube-api-access-qd5wt\") pod \"watcher-applier-0\" (UID: \"48193dc0-3638-40e3-8b54-2b2049bd5925\") " pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.646402 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.658785 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbx7f\" (UniqueName: \"kubernetes.io/projected/7c50f741-28f6-4bb3-865e-fc21813c1b00-kube-api-access-nbx7f\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.658933 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.901461 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 05:45:20 crc kubenswrapper[4956]: I0930 05:45:20.912096 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7dcbfc4c57-z8gm9"] Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.066064 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7e014251-976b-4acd-ac9c-30b14ed29c6c","Type":"ContainerStarted","Data":"fa3ae532d40f56a63b7d99aa8c34a1a33c1b1f87b46ca752802fd77ed16e4eaf"} Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.074099 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" event={"ID":"b0f69bb4-0f8e-489f-9a47-06e84127473e","Type":"ContainerStarted","Data":"d19aa099dc4e5112b74d9ddd6d508896e97a9418be1fc9d214a21c4c543234c7"} Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.074645 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.095594 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e51333ef-f104-4d0d-a00b-76a2800a425a","Type":"ContainerStarted","Data":"f290fda2699c3420e1a65192c4d5f290468f1f8fa05af961d2f7c175441173e3"} Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.098555 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dcbfc4c57-z8gm9" event={"ID":"a55e72da-6cd5-48ac-b98b-ce39235f96f1","Type":"ContainerStarted","Data":"d3bdcdc270a6f7b5c5a5ea04c9f6ac49f912ee4048311f7dcedfe85e609730ed"} Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.100862 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.110106 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" 
podStartSLOduration=4.110088736 podStartE2EDuration="4.110088736s" podCreationTimestamp="2025-09-30 05:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:21.102490307 +0000 UTC m=+991.429610832" watchObservedRunningTime="2025-09-30 05:45:21.110088736 +0000 UTC m=+991.437209261" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.165048 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a921-account-create-q4xck"] Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.168190 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a921-account-create-q4xck" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.190303 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a921-account-create-q4xck"] Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.192729 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.240695 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwxjk\" (UniqueName: \"kubernetes.io/projected/8fbaf681-c7d0-4217-be5c-1a15e5e36786-kube-api-access-xwxjk\") pod \"cinder-a921-account-create-q4xck\" (UID: \"8fbaf681-c7d0-4217-be5c-1a15e5e36786\") " pod="openstack/cinder-a921-account-create-q4xck" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.283771 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3e76-account-create-gl5vp"] Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.285012 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3e76-account-create-gl5vp" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.294343 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3e76-account-create-gl5vp"] Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.298892 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.302265 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-cd8q4"] Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.314326 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cd8q4" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.320511 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vhp84" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.332440 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.342280 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cd8q4"] Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.345619 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p2hv\" (UniqueName: \"kubernetes.io/projected/d947526f-907a-4951-bbc4-51e29c560a06-kube-api-access-4p2hv\") pod \"barbican-db-sync-cd8q4\" (UID: \"d947526f-907a-4951-bbc4-51e29c560a06\") " pod="openstack/barbican-db-sync-cd8q4" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.345666 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwxjk\" (UniqueName: \"kubernetes.io/projected/8fbaf681-c7d0-4217-be5c-1a15e5e36786-kube-api-access-xwxjk\") pod \"cinder-a921-account-create-q4xck\" (UID: 
\"8fbaf681-c7d0-4217-be5c-1a15e5e36786\") " pod="openstack/cinder-a921-account-create-q4xck" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.345715 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkqf9\" (UniqueName: \"kubernetes.io/projected/6488fac1-2ea4-4c60-bb3c-8014ec1e8149-kube-api-access-xkqf9\") pod \"neutron-3e76-account-create-gl5vp\" (UID: \"6488fac1-2ea4-4c60-bb3c-8014ec1e8149\") " pod="openstack/neutron-3e76-account-create-gl5vp" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.345765 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d947526f-907a-4951-bbc4-51e29c560a06-combined-ca-bundle\") pod \"barbican-db-sync-cd8q4\" (UID: \"d947526f-907a-4951-bbc4-51e29c560a06\") " pod="openstack/barbican-db-sync-cd8q4" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.345834 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d947526f-907a-4951-bbc4-51e29c560a06-db-sync-config-data\") pod \"barbican-db-sync-cd8q4\" (UID: \"d947526f-907a-4951-bbc4-51e29c560a06\") " pod="openstack/barbican-db-sync-cd8q4" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.374582 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwxjk\" (UniqueName: \"kubernetes.io/projected/8fbaf681-c7d0-4217-be5c-1a15e5e36786-kube-api-access-xwxjk\") pod \"cinder-a921-account-create-q4xck\" (UID: \"8fbaf681-c7d0-4217-be5c-1a15e5e36786\") " pod="openstack/cinder-a921-account-create-q4xck" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.379813 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 05:45:21 crc kubenswrapper[4956]: W0930 05:45:21.389737 4956 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48193dc0_3638_40e3_8b54_2b2049bd5925.slice/crio-d639696479593bbf632a03eabb00f1899b5d5ec8e7d1cf6cd43b8b434e7efd1b WatchSource:0}: Error finding container d639696479593bbf632a03eabb00f1899b5d5ec8e7d1cf6cd43b8b434e7efd1b: Status 404 returned error can't find the container with id d639696479593bbf632a03eabb00f1899b5d5ec8e7d1cf6cd43b8b434e7efd1b Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.462597 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkqf9\" (UniqueName: \"kubernetes.io/projected/6488fac1-2ea4-4c60-bb3c-8014ec1e8149-kube-api-access-xkqf9\") pod \"neutron-3e76-account-create-gl5vp\" (UID: \"6488fac1-2ea4-4c60-bb3c-8014ec1e8149\") " pod="openstack/neutron-3e76-account-create-gl5vp" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.462735 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d947526f-907a-4951-bbc4-51e29c560a06-combined-ca-bundle\") pod \"barbican-db-sync-cd8q4\" (UID: \"d947526f-907a-4951-bbc4-51e29c560a06\") " pod="openstack/barbican-db-sync-cd8q4" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.462927 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d947526f-907a-4951-bbc4-51e29c560a06-db-sync-config-data\") pod \"barbican-db-sync-cd8q4\" (UID: \"d947526f-907a-4951-bbc4-51e29c560a06\") " pod="openstack/barbican-db-sync-cd8q4" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.462997 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p2hv\" (UniqueName: \"kubernetes.io/projected/d947526f-907a-4951-bbc4-51e29c560a06-kube-api-access-4p2hv\") pod \"barbican-db-sync-cd8q4\" (UID: \"d947526f-907a-4951-bbc4-51e29c560a06\") " pod="openstack/barbican-db-sync-cd8q4" Sep 
30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.472898 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d947526f-907a-4951-bbc4-51e29c560a06-db-sync-config-data\") pod \"barbican-db-sync-cd8q4\" (UID: \"d947526f-907a-4951-bbc4-51e29c560a06\") " pod="openstack/barbican-db-sync-cd8q4" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.481468 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d947526f-907a-4951-bbc4-51e29c560a06-combined-ca-bundle\") pod \"barbican-db-sync-cd8q4\" (UID: \"d947526f-907a-4951-bbc4-51e29c560a06\") " pod="openstack/barbican-db-sync-cd8q4" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.489805 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p2hv\" (UniqueName: \"kubernetes.io/projected/d947526f-907a-4951-bbc4-51e29c560a06-kube-api-access-4p2hv\") pod \"barbican-db-sync-cd8q4\" (UID: \"d947526f-907a-4951-bbc4-51e29c560a06\") " pod="openstack/barbican-db-sync-cd8q4" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.492319 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkqf9\" (UniqueName: \"kubernetes.io/projected/6488fac1-2ea4-4c60-bb3c-8014ec1e8149-kube-api-access-xkqf9\") pod \"neutron-3e76-account-create-gl5vp\" (UID: \"6488fac1-2ea4-4c60-bb3c-8014ec1e8149\") " pod="openstack/neutron-3e76-account-create-gl5vp" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.526959 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a921-account-create-q4xck" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.543913 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 05:45:21 crc kubenswrapper[4956]: W0930 05:45:21.576954 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c50f741_28f6_4bb3_865e_fc21813c1b00.slice/crio-d438b85792f9e7a9e2c5ab07a5886697d00f5bd0104894042ecdcfe119080d6d WatchSource:0}: Error finding container d438b85792f9e7a9e2c5ab07a5886697d00f5bd0104894042ecdcfe119080d6d: Status 404 returned error can't find the container with id d438b85792f9e7a9e2c5ab07a5886697d00f5bd0104894042ecdcfe119080d6d Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.653788 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3e76-account-create-gl5vp" Sep 30 05:45:21 crc kubenswrapper[4956]: I0930 05:45:21.667418 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-cd8q4" Sep 30 05:45:22 crc kubenswrapper[4956]: I0930 05:45:22.144206 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7e014251-976b-4acd-ac9c-30b14ed29c6c","Type":"ContainerStarted","Data":"3e1f243e230666e6cbb921173fd66b7c96619ede87f4641aa5e56ac9001fd8c7"} Sep 30 05:45:22 crc kubenswrapper[4956]: I0930 05:45:22.147239 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"7c50f741-28f6-4bb3-865e-fc21813c1b00","Type":"ContainerStarted","Data":"e830732805e896396d587045b8c84fa19caae122664496e4b9cc7b7e02f7b189"} Sep 30 05:45:22 crc kubenswrapper[4956]: I0930 05:45:22.147265 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"7c50f741-28f6-4bb3-865e-fc21813c1b00","Type":"ContainerStarted","Data":"d438b85792f9e7a9e2c5ab07a5886697d00f5bd0104894042ecdcfe119080d6d"} Sep 30 05:45:22 crc kubenswrapper[4956]: I0930 05:45:22.150373 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"48193dc0-3638-40e3-8b54-2b2049bd5925","Type":"ContainerStarted","Data":"d639696479593bbf632a03eabb00f1899b5d5ec8e7d1cf6cd43b8b434e7efd1b"} Sep 30 05:45:22 crc kubenswrapper[4956]: I0930 05:45:22.162108 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aad83b1a-8ea2-4f43-a6e3-b8e844a65115","Type":"ContainerStarted","Data":"53cd4314cb975f59df3d869ba72ca0c28f5150a6f837ceb05ae3b8135175821d"} Sep 30 05:45:22 crc kubenswrapper[4956]: I0930 05:45:22.205350 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a921-account-create-q4xck"] Sep 30 05:45:22 crc kubenswrapper[4956]: W0930 05:45:22.358302 4956 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd947526f_907a_4951_bbc4_51e29c560a06.slice/crio-7a6b62a66a44cb3ad97d416b1cf493838df5729109a80aa88ac0d7304c7598b1 WatchSource:0}: Error finding container 7a6b62a66a44cb3ad97d416b1cf493838df5729109a80aa88ac0d7304c7598b1: Status 404 returned error can't find the container with id 7a6b62a66a44cb3ad97d416b1cf493838df5729109a80aa88ac0d7304c7598b1 Sep 30 05:45:22 crc kubenswrapper[4956]: I0930 05:45:22.363545 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cd8q4"] Sep 30 05:45:22 crc kubenswrapper[4956]: I0930 05:45:22.535792 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3e76-account-create-gl5vp"] Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.194419 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7e014251-976b-4acd-ac9c-30b14ed29c6c","Type":"ContainerStarted","Data":"64c657c67da3d57af3ab43ba2cca01b58dbfb7c3a32ad06f048fafe65e2880f9"} Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.194631 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7e014251-976b-4acd-ac9c-30b14ed29c6c" containerName="glance-log" containerID="cri-o://3e1f243e230666e6cbb921173fd66b7c96619ede87f4641aa5e56ac9001fd8c7" gracePeriod=30 Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.194929 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7e014251-976b-4acd-ac9c-30b14ed29c6c" containerName="glance-httpd" containerID="cri-o://64c657c67da3d57af3ab43ba2cca01b58dbfb7c3a32ad06f048fafe65e2880f9" gracePeriod=30 Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.205047 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cd8q4" 
event={"ID":"d947526f-907a-4951-bbc4-51e29c560a06","Type":"ContainerStarted","Data":"7a6b62a66a44cb3ad97d416b1cf493838df5729109a80aa88ac0d7304c7598b1"} Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.228377 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"7c50f741-28f6-4bb3-865e-fc21813c1b00","Type":"ContainerStarted","Data":"fdb70915fb7c98a304429f380940af88934d57f1b6e20dc050936830e371b909"} Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.228738 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.235390 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3e76-account-create-gl5vp" event={"ID":"6488fac1-2ea4-4c60-bb3c-8014ec1e8149","Type":"ContainerStarted","Data":"81798e0d8072235339857c798c8b6e1b6fafc5fd4d1819afa49e16da10d22f6c"} Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.235449 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3e76-account-create-gl5vp" event={"ID":"6488fac1-2ea4-4c60-bb3c-8014ec1e8149","Type":"ContainerStarted","Data":"d35528b7bd446c18732017b452478b74bc4154998496ad43d386b1842d3a13bb"} Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.239013 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e51333ef-f104-4d0d-a00b-76a2800a425a","Type":"ContainerStarted","Data":"b68193af02d30c7f87ae81cf906482510126044a33388b576fe65bfad61a28c3"} Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.239218 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e51333ef-f104-4d0d-a00b-76a2800a425a" containerName="glance-log" containerID="cri-o://f290fda2699c3420e1a65192c4d5f290468f1f8fa05af961d2f7c175441173e3" gracePeriod=30 Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.239525 4956 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e51333ef-f104-4d0d-a00b-76a2800a425a" containerName="glance-httpd" containerID="cri-o://b68193af02d30c7f87ae81cf906482510126044a33388b576fe65bfad61a28c3" gracePeriod=30 Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.241727 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.241705263 podStartE2EDuration="6.241705263s" podCreationTimestamp="2025-09-30 05:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:23.228028903 +0000 UTC m=+993.555149438" watchObservedRunningTime="2025-09-30 05:45:23.241705263 +0000 UTC m=+993.568825788" Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.256530 4956 generic.go:334] "Generic (PLEG): container finished" podID="8fbaf681-c7d0-4217-be5c-1a15e5e36786" containerID="a19ec9f1292a11a446422a9d1f7a9f37771e9154f9e3722a35d8537ac0a6a2d1" exitCode=0 Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.256589 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a921-account-create-q4xck" event={"ID":"8fbaf681-c7d0-4217-be5c-1a15e5e36786","Type":"ContainerDied","Data":"a19ec9f1292a11a446422a9d1f7a9f37771e9154f9e3722a35d8537ac0a6a2d1"} Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.256619 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a921-account-create-q4xck" event={"ID":"8fbaf681-c7d0-4217-be5c-1a15e5e36786","Type":"ContainerStarted","Data":"565a46eecd1550b0b5db1009f56f36ed4283ecd82fc666a3da283dd0bd9aba21"} Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.268160 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.268137794 podStartE2EDuration="3.268137794s" 
podCreationTimestamp="2025-09-30 05:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:23.263777707 +0000 UTC m=+993.590898232" watchObservedRunningTime="2025-09-30 05:45:23.268137794 +0000 UTC m=+993.595258319" Sep 30 05:45:23 crc kubenswrapper[4956]: I0930 05:45:23.291763 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.291746936 podStartE2EDuration="6.291746936s" podCreationTimestamp="2025-09-30 05:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:23.288470373 +0000 UTC m=+993.615590898" watchObservedRunningTime="2025-09-30 05:45:23.291746936 +0000 UTC m=+993.618867461" Sep 30 05:45:24 crc kubenswrapper[4956]: I0930 05:45:24.268818 4956 generic.go:334] "Generic (PLEG): container finished" podID="e51333ef-f104-4d0d-a00b-76a2800a425a" containerID="b68193af02d30c7f87ae81cf906482510126044a33388b576fe65bfad61a28c3" exitCode=0 Sep 30 05:45:24 crc kubenswrapper[4956]: I0930 05:45:24.269077 4956 generic.go:334] "Generic (PLEG): container finished" podID="e51333ef-f104-4d0d-a00b-76a2800a425a" containerID="f290fda2699c3420e1a65192c4d5f290468f1f8fa05af961d2f7c175441173e3" exitCode=143 Sep 30 05:45:24 crc kubenswrapper[4956]: I0930 05:45:24.269139 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e51333ef-f104-4d0d-a00b-76a2800a425a","Type":"ContainerDied","Data":"b68193af02d30c7f87ae81cf906482510126044a33388b576fe65bfad61a28c3"} Sep 30 05:45:24 crc kubenswrapper[4956]: I0930 05:45:24.269167 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e51333ef-f104-4d0d-a00b-76a2800a425a","Type":"ContainerDied","Data":"f290fda2699c3420e1a65192c4d5f290468f1f8fa05af961d2f7c175441173e3"} Sep 30 05:45:24 crc kubenswrapper[4956]: I0930 05:45:24.272680 4956 generic.go:334] "Generic (PLEG): container finished" podID="7e014251-976b-4acd-ac9c-30b14ed29c6c" containerID="64c657c67da3d57af3ab43ba2cca01b58dbfb7c3a32ad06f048fafe65e2880f9" exitCode=143 Sep 30 05:45:24 crc kubenswrapper[4956]: I0930 05:45:24.272720 4956 generic.go:334] "Generic (PLEG): container finished" podID="7e014251-976b-4acd-ac9c-30b14ed29c6c" containerID="3e1f243e230666e6cbb921173fd66b7c96619ede87f4641aa5e56ac9001fd8c7" exitCode=143 Sep 30 05:45:24 crc kubenswrapper[4956]: I0930 05:45:24.272726 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7e014251-976b-4acd-ac9c-30b14ed29c6c","Type":"ContainerDied","Data":"64c657c67da3d57af3ab43ba2cca01b58dbfb7c3a32ad06f048fafe65e2880f9"} Sep 30 05:45:24 crc kubenswrapper[4956]: I0930 05:45:24.272763 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7e014251-976b-4acd-ac9c-30b14ed29c6c","Type":"ContainerDied","Data":"3e1f243e230666e6cbb921173fd66b7c96619ede87f4641aa5e56ac9001fd8c7"} Sep 30 05:45:24 crc kubenswrapper[4956]: I0930 05:45:24.275186 4956 generic.go:334] "Generic (PLEG): container finished" podID="6488fac1-2ea4-4c60-bb3c-8014ec1e8149" containerID="81798e0d8072235339857c798c8b6e1b6fafc5fd4d1819afa49e16da10d22f6c" exitCode=0 Sep 30 05:45:24 crc kubenswrapper[4956]: I0930 05:45:24.276220 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3e76-account-create-gl5vp" event={"ID":"6488fac1-2ea4-4c60-bb3c-8014ec1e8149","Type":"ContainerDied","Data":"81798e0d8072235339857c798c8b6e1b6fafc5fd4d1819afa49e16da10d22f6c"} Sep 30 05:45:25 crc kubenswrapper[4956]: I0930 05:45:25.300272 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="1e7c7fc5-5510-4b6f-81e2-96f428be0a0b" containerID="04b84c0b3e87464976e5c29c292c9d76380685b63d170678c4fb2aefa5553328" exitCode=0 Sep 30 05:45:25 crc kubenswrapper[4956]: I0930 05:45:25.300459 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ns6tm" event={"ID":"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b","Type":"ContainerDied","Data":"04b84c0b3e87464976e5c29c292c9d76380685b63d170678c4fb2aefa5553328"} Sep 30 05:45:25 crc kubenswrapper[4956]: I0930 05:45:25.837217 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 05:45:25 crc kubenswrapper[4956]: I0930 05:45:25.902924 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.409930 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dfd95655c-tvpq8"] Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.450551 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68c96b554d-8mtnt"] Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.452622 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.454758 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.461611 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68c96b554d-8mtnt"] Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.524256 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7dcbfc4c57-z8gm9"] Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.525457 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-combined-ca-bundle\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.525575 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jthms\" (UniqueName: \"kubernetes.io/projected/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-kube-api-access-jthms\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.525685 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-horizon-tls-certs\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.525771 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-scripts\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.525858 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-logs\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.525937 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-horizon-secret-key\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.526017 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-config-data\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.550887 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5f96b888bb-bhtl9"] Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.552673 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.561937 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f96b888bb-bhtl9"] Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627464 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-horizon-tls-certs\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627504 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29ac7f2-13b9-47d8-9218-fb08840e6704-combined-ca-bundle\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627543 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f29ac7f2-13b9-47d8-9218-fb08840e6704-logs\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627563 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-scripts\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627578 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-logs\") 
pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627599 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-horizon-secret-key\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627627 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-config-data\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627642 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f29ac7f2-13b9-47d8-9218-fb08840e6704-horizon-secret-key\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627660 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fsvr\" (UniqueName: \"kubernetes.io/projected/f29ac7f2-13b9-47d8-9218-fb08840e6704-kube-api-access-2fsvr\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627679 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f29ac7f2-13b9-47d8-9218-fb08840e6704-scripts\") pod \"horizon-5f96b888bb-bhtl9\" 
(UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627742 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29ac7f2-13b9-47d8-9218-fb08840e6704-horizon-tls-certs\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627760 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-combined-ca-bundle\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627777 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f29ac7f2-13b9-47d8-9218-fb08840e6704-config-data\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.627793 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jthms\" (UniqueName: \"kubernetes.io/projected/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-kube-api-access-jthms\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.628916 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-scripts\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " 
pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.629234 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-logs\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.631396 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-config-data\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.639658 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-horizon-secret-key\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.640028 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-combined-ca-bundle\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.650951 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jthms\" (UniqueName: \"kubernetes.io/projected/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-kube-api-access-jthms\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.667218 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-horizon-tls-certs\") pod \"horizon-68c96b554d-8mtnt\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.729038 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29ac7f2-13b9-47d8-9218-fb08840e6704-horizon-tls-certs\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.729412 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f29ac7f2-13b9-47d8-9218-fb08840e6704-config-data\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.729508 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29ac7f2-13b9-47d8-9218-fb08840e6704-combined-ca-bundle\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.729562 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f29ac7f2-13b9-47d8-9218-fb08840e6704-logs\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.729624 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/f29ac7f2-13b9-47d8-9218-fb08840e6704-horizon-secret-key\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.729646 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fsvr\" (UniqueName: \"kubernetes.io/projected/f29ac7f2-13b9-47d8-9218-fb08840e6704-kube-api-access-2fsvr\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.729667 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f29ac7f2-13b9-47d8-9218-fb08840e6704-scripts\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.730567 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f29ac7f2-13b9-47d8-9218-fb08840e6704-scripts\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.731437 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f29ac7f2-13b9-47d8-9218-fb08840e6704-logs\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.732473 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f29ac7f2-13b9-47d8-9218-fb08840e6704-config-data\") pod \"horizon-5f96b888bb-bhtl9\" (UID: 
\"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.734826 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f29ac7f2-13b9-47d8-9218-fb08840e6704-horizon-secret-key\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.735544 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f29ac7f2-13b9-47d8-9218-fb08840e6704-horizon-tls-certs\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.739106 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29ac7f2-13b9-47d8-9218-fb08840e6704-combined-ca-bundle\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.756576 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fsvr\" (UniqueName: \"kubernetes.io/projected/f29ac7f2-13b9-47d8-9218-fb08840e6704-kube-api-access-2fsvr\") pod \"horizon-5f96b888bb-bhtl9\" (UID: \"f29ac7f2-13b9-47d8-9218-fb08840e6704\") " pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.783750 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:26 crc kubenswrapper[4956]: I0930 05:45:26.886826 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:27 crc kubenswrapper[4956]: I0930 05:45:27.985752 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:45:28 crc kubenswrapper[4956]: I0930 05:45:28.052918 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-hwpd5"] Sep 30 05:45:28 crc kubenswrapper[4956]: I0930 05:45:28.053187 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" podUID="eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" containerName="dnsmasq-dns" containerID="cri-o://02a32a5345e4055e251f2625d2879b5c43b7bad65df838c3cf1eb9a3f53a1c82" gracePeriod=10 Sep 30 05:45:28 crc kubenswrapper[4956]: I0930 05:45:28.339540 4956 generic.go:334] "Generic (PLEG): container finished" podID="eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" containerID="02a32a5345e4055e251f2625d2879b5c43b7bad65df838c3cf1eb9a3f53a1c82" exitCode=0 Sep 30 05:45:28 crc kubenswrapper[4956]: I0930 05:45:28.339630 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" event={"ID":"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1","Type":"ContainerDied","Data":"02a32a5345e4055e251f2625d2879b5c43b7bad65df838c3cf1eb9a3f53a1c82"} Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.153800 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a921-account-create-q4xck" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.160729 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3e76-account-create-gl5vp" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.193649 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.307109 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkqf9\" (UniqueName: \"kubernetes.io/projected/6488fac1-2ea4-4c60-bb3c-8014ec1e8149-kube-api-access-xkqf9\") pod \"6488fac1-2ea4-4c60-bb3c-8014ec1e8149\" (UID: \"6488fac1-2ea4-4c60-bb3c-8014ec1e8149\") " Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.307302 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-credential-keys\") pod \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.307599 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwxjk\" (UniqueName: \"kubernetes.io/projected/8fbaf681-c7d0-4217-be5c-1a15e5e36786-kube-api-access-xwxjk\") pod \"8fbaf681-c7d0-4217-be5c-1a15e5e36786\" (UID: \"8fbaf681-c7d0-4217-be5c-1a15e5e36786\") " Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.307673 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-scripts\") pod \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.307691 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-fernet-keys\") pod \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.307706 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-combined-ca-bundle\") pod \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.307787 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw8hf\" (UniqueName: \"kubernetes.io/projected/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-kube-api-access-rw8hf\") pod \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.307815 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-config-data\") pod \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\" (UID: \"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b\") " Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.315262 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbaf681-c7d0-4217-be5c-1a15e5e36786-kube-api-access-xwxjk" (OuterVolumeSpecName: "kube-api-access-xwxjk") pod "8fbaf681-c7d0-4217-be5c-1a15e5e36786" (UID: "8fbaf681-c7d0-4217-be5c-1a15e5e36786"). InnerVolumeSpecName "kube-api-access-xwxjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.315929 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6488fac1-2ea4-4c60-bb3c-8014ec1e8149-kube-api-access-xkqf9" (OuterVolumeSpecName: "kube-api-access-xkqf9") pod "6488fac1-2ea4-4c60-bb3c-8014ec1e8149" (UID: "6488fac1-2ea4-4c60-bb3c-8014ec1e8149"). InnerVolumeSpecName "kube-api-access-xkqf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.316763 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-scripts" (OuterVolumeSpecName: "scripts") pod "1e7c7fc5-5510-4b6f-81e2-96f428be0a0b" (UID: "1e7c7fc5-5510-4b6f-81e2-96f428be0a0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.318619 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1e7c7fc5-5510-4b6f-81e2-96f428be0a0b" (UID: "1e7c7fc5-5510-4b6f-81e2-96f428be0a0b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.318938 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-kube-api-access-rw8hf" (OuterVolumeSpecName: "kube-api-access-rw8hf") pod "1e7c7fc5-5510-4b6f-81e2-96f428be0a0b" (UID: "1e7c7fc5-5510-4b6f-81e2-96f428be0a0b"). InnerVolumeSpecName "kube-api-access-rw8hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.322341 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1e7c7fc5-5510-4b6f-81e2-96f428be0a0b" (UID: "1e7c7fc5-5510-4b6f-81e2-96f428be0a0b"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.339649 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-config-data" (OuterVolumeSpecName: "config-data") pod "1e7c7fc5-5510-4b6f-81e2-96f428be0a0b" (UID: "1e7c7fc5-5510-4b6f-81e2-96f428be0a0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.354401 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e7c7fc5-5510-4b6f-81e2-96f428be0a0b" (UID: "1e7c7fc5-5510-4b6f-81e2-96f428be0a0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.386499 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a921-account-create-q4xck" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.386568 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a921-account-create-q4xck" event={"ID":"8fbaf681-c7d0-4217-be5c-1a15e5e36786","Type":"ContainerDied","Data":"565a46eecd1550b0b5db1009f56f36ed4283ecd82fc666a3da283dd0bd9aba21"} Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.386636 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="565a46eecd1550b0b5db1009f56f36ed4283ecd82fc666a3da283dd0bd9aba21" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.400777 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ns6tm" event={"ID":"1e7c7fc5-5510-4b6f-81e2-96f428be0a0b","Type":"ContainerDied","Data":"d7880981caec4824461e7ef955b79d11f5260d9241f6db08fae6f4cfb42f8915"} Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.400813 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7880981caec4824461e7ef955b79d11f5260d9241f6db08fae6f4cfb42f8915" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.400866 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ns6tm" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.412445 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3e76-account-create-gl5vp" event={"ID":"6488fac1-2ea4-4c60-bb3c-8014ec1e8149","Type":"ContainerDied","Data":"d35528b7bd446c18732017b452478b74bc4154998496ad43d386b1842d3a13bb"} Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.412515 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d35528b7bd446c18732017b452478b74bc4154998496ad43d386b1842d3a13bb" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.412591 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3e76-account-create-gl5vp" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.421175 4956 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.421205 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwxjk\" (UniqueName: \"kubernetes.io/projected/8fbaf681-c7d0-4217-be5c-1a15e5e36786-kube-api-access-xwxjk\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.421218 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.421226 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.421234 4956 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.421243 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw8hf\" (UniqueName: \"kubernetes.io/projected/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-kube-api-access-rw8hf\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.421251 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 
05:45:30.421259 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkqf9\" (UniqueName: \"kubernetes.io/projected/6488fac1-2ea4-4c60-bb3c-8014ec1e8149-kube-api-access-xkqf9\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.901761 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Sep 30 05:45:30 crc kubenswrapper[4956]: I0930 05:45:30.905064 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.282259 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ns6tm"] Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.289604 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ns6tm"] Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.388628 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-t7llp"] Sep 30 05:45:31 crc kubenswrapper[4956]: E0930 05:45:31.389303 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e7c7fc5-5510-4b6f-81e2-96f428be0a0b" containerName="keystone-bootstrap" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.389329 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7c7fc5-5510-4b6f-81e2-96f428be0a0b" containerName="keystone-bootstrap" Sep 30 05:45:31 crc kubenswrapper[4956]: E0930 05:45:31.389383 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6488fac1-2ea4-4c60-bb3c-8014ec1e8149" containerName="mariadb-account-create" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.389392 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6488fac1-2ea4-4c60-bb3c-8014ec1e8149" containerName="mariadb-account-create" Sep 30 05:45:31 crc kubenswrapper[4956]: E0930 05:45:31.389416 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8fbaf681-c7d0-4217-be5c-1a15e5e36786" containerName="mariadb-account-create" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.389423 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbaf681-c7d0-4217-be5c-1a15e5e36786" containerName="mariadb-account-create" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.389752 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e7c7fc5-5510-4b6f-81e2-96f428be0a0b" containerName="keystone-bootstrap" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.389770 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbaf681-c7d0-4217-be5c-1a15e5e36786" containerName="mariadb-account-create" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.389799 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6488fac1-2ea4-4c60-bb3c-8014ec1e8149" containerName="mariadb-account-create" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.391014 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.394357 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.394735 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.394921 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.395955 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n4lp7" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.397656 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t7llp"] Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.425231 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.480665 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-credential-keys\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.480736 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-combined-ca-bundle\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.480851 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-fernet-keys\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.480955 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-scripts\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.481006 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh5l5\" (UniqueName: \"kubernetes.io/projected/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-kube-api-access-jh5l5\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.481157 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-config-data\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.530058 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-hvzfv"] Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.531782 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.534199 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.534898 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2xwn5" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.536200 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.541649 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hvzfv"] Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.583828 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-credential-keys\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.583883 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-combined-ca-bundle\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.583913 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7d95941-d95f-4302-84f1-9230a7b9001f-config\") pod \"neutron-db-sync-hvzfv\" (UID: \"d7d95941-d95f-4302-84f1-9230a7b9001f\") " pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.583960 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-fernet-keys\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.583977 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d95941-d95f-4302-84f1-9230a7b9001f-combined-ca-bundle\") pod \"neutron-db-sync-hvzfv\" (UID: \"d7d95941-d95f-4302-84f1-9230a7b9001f\") " pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.584025 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-scripts\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.584052 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh5l5\" (UniqueName: \"kubernetes.io/projected/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-kube-api-access-jh5l5\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.584096 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-config-data\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.584130 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx8hf\" (UniqueName: 
\"kubernetes.io/projected/d7d95941-d95f-4302-84f1-9230a7b9001f-kube-api-access-fx8hf\") pod \"neutron-db-sync-hvzfv\" (UID: \"d7d95941-d95f-4302-84f1-9230a7b9001f\") " pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.591179 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-combined-ca-bundle\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.593063 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-config-data\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.598977 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-fernet-keys\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.602345 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-credential-keys\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.602710 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh5l5\" (UniqueName: \"kubernetes.io/projected/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-kube-api-access-jh5l5\") pod \"keystone-bootstrap-t7llp\" (UID: 
\"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.617498 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-scripts\") pod \"keystone-bootstrap-t7llp\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.685454 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx8hf\" (UniqueName: \"kubernetes.io/projected/d7d95941-d95f-4302-84f1-9230a7b9001f-kube-api-access-fx8hf\") pod \"neutron-db-sync-hvzfv\" (UID: \"d7d95941-d95f-4302-84f1-9230a7b9001f\") " pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.685545 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7d95941-d95f-4302-84f1-9230a7b9001f-config\") pod \"neutron-db-sync-hvzfv\" (UID: \"d7d95941-d95f-4302-84f1-9230a7b9001f\") " pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.685580 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d95941-d95f-4302-84f1-9230a7b9001f-combined-ca-bundle\") pod \"neutron-db-sync-hvzfv\" (UID: \"d7d95941-d95f-4302-84f1-9230a7b9001f\") " pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.689907 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7d95941-d95f-4302-84f1-9230a7b9001f-config\") pod \"neutron-db-sync-hvzfv\" (UID: \"d7d95941-d95f-4302-84f1-9230a7b9001f\") " pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.691632 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d95941-d95f-4302-84f1-9230a7b9001f-combined-ca-bundle\") pod \"neutron-db-sync-hvzfv\" (UID: \"d7d95941-d95f-4302-84f1-9230a7b9001f\") " pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.703457 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx8hf\" (UniqueName: \"kubernetes.io/projected/d7d95941-d95f-4302-84f1-9230a7b9001f-kube-api-access-fx8hf\") pod \"neutron-db-sync-hvzfv\" (UID: \"d7d95941-d95f-4302-84f1-9230a7b9001f\") " pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.718670 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:31 crc kubenswrapper[4956]: I0930 05:45:31.854717 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:45:32 crc kubenswrapper[4956]: I0930 05:45:32.352344 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e7c7fc5-5510-4b6f-81e2-96f428be0a0b" path="/var/lib/kubelet/pods/1e7c7fc5-5510-4b6f-81e2-96f428be0a0b/volumes" Sep 30 05:45:34 crc kubenswrapper[4956]: I0930 05:45:34.173465 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 05:45:34 crc kubenswrapper[4956]: I0930 05:45:34.176793 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="7c50f741-28f6-4bb3-865e-fc21813c1b00" containerName="watcher-api-log" containerID="cri-o://e830732805e896396d587045b8c84fa19caae122664496e4b9cc7b7e02f7b189" gracePeriod=30 Sep 30 05:45:34 crc kubenswrapper[4956]: I0930 05:45:34.177212 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" 
podUID="7c50f741-28f6-4bb3-865e-fc21813c1b00" containerName="watcher-api" containerID="cri-o://fdb70915fb7c98a304429f380940af88934d57f1b6e20dc050936830e371b909" gracePeriod=30 Sep 30 05:45:34 crc kubenswrapper[4956]: I0930 05:45:34.453509 4956 generic.go:334] "Generic (PLEG): container finished" podID="7c50f741-28f6-4bb3-865e-fc21813c1b00" containerID="e830732805e896396d587045b8c84fa19caae122664496e4b9cc7b7e02f7b189" exitCode=143 Sep 30 05:45:34 crc kubenswrapper[4956]: I0930 05:45:34.453555 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"7c50f741-28f6-4bb3-865e-fc21813c1b00","Type":"ContainerDied","Data":"e830732805e896396d587045b8c84fa19caae122664496e4b9cc7b7e02f7b189"} Sep 30 05:45:35 crc kubenswrapper[4956]: I0930 05:45:35.466870 4956 generic.go:334] "Generic (PLEG): container finished" podID="7c50f741-28f6-4bb3-865e-fc21813c1b00" containerID="fdb70915fb7c98a304429f380940af88934d57f1b6e20dc050936830e371b909" exitCode=0 Sep 30 05:45:35 crc kubenswrapper[4956]: I0930 05:45:35.467073 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"7c50f741-28f6-4bb3-865e-fc21813c1b00","Type":"ContainerDied","Data":"fdb70915fb7c98a304429f380940af88934d57f1b6e20dc050936830e371b909"} Sep 30 05:45:35 crc kubenswrapper[4956]: I0930 05:45:35.902709 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="7c50f741-28f6-4bb3-865e-fc21813c1b00" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": dial tcp 10.217.0.154:9322: connect: connection refused" Sep 30 05:45:35 crc kubenswrapper[4956]: I0930 05:45:35.902713 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="7c50f741-28f6-4bb3-865e-fc21813c1b00" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": dial tcp 10.217.0.154:9322: connect: connection refused" Sep 30 05:45:36 crc 
kubenswrapper[4956]: I0930 05:45:36.392906 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-nnr6k"] Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.394893 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.397563 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.397719 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9tmwg" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.397854 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.405230 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nnr6k"] Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.490902 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/740de0e5-c3e9-43bc-bb01-8c240f50070e-etc-machine-id\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.490981 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-combined-ca-bundle\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.491188 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-scripts\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.491249 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8xzz\" (UniqueName: \"kubernetes.io/projected/740de0e5-c3e9-43bc-bb01-8c240f50070e-kube-api-access-f8xzz\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.491385 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-config-data\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.491530 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-db-sync-config-data\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.593396 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-config-data\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.593535 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-db-sync-config-data\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.593589 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/740de0e5-c3e9-43bc-bb01-8c240f50070e-etc-machine-id\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.593621 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-combined-ca-bundle\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.593679 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-scripts\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.593702 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8xzz\" (UniqueName: \"kubernetes.io/projected/740de0e5-c3e9-43bc-bb01-8c240f50070e-kube-api-access-f8xzz\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.593754 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/740de0e5-c3e9-43bc-bb01-8c240f50070e-etc-machine-id\") pod \"cinder-db-sync-nnr6k\" (UID: 
\"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.598688 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-db-sync-config-data\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.600792 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-combined-ca-bundle\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.609554 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-scripts\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.610505 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-config-data\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.611432 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8xzz\" (UniqueName: \"kubernetes.io/projected/740de0e5-c3e9-43bc-bb01-8c240f50070e-kube-api-access-f8xzz\") pod \"cinder-db-sync-nnr6k\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.745717 4956 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:45:36 crc kubenswrapper[4956]: I0930 05:45:36.937765 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" podUID="eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Sep 30 05:45:38 crc kubenswrapper[4956]: E0930 05:45:38.008060 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-horizon:current" Sep 30 05:45:38 crc kubenswrapper[4956]: E0930 05:45:38.008107 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-horizon:current" Sep 30 05:45:38 crc kubenswrapper[4956]: E0930 05:45:38.008251 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.rdoproject.org/podified-master-centos10/openstack-horizon:current,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n57bh579h65dh654h5d5h654h544hb8h4h58bh589hdfh55dh587hdbh5b8h646h57dhc6h598h57dh65h55bh54hb6h556h654h654h5f6h684h647h677q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgfrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7dcbfc4c57-z8gm9_openstack(a55e72da-6cd5-48ac-b98b-ce39235f96f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:45:38 crc kubenswrapper[4956]: E0930 
05:45:38.010156 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-horizon:current\\\"\"]" pod="openstack/horizon-7dcbfc4c57-z8gm9" podUID="a55e72da-6cd5-48ac-b98b-ce39235f96f1" Sep 30 05:45:38 crc kubenswrapper[4956]: E0930 05:45:38.019783 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-horizon:current" Sep 30 05:45:38 crc kubenswrapper[4956]: E0930 05:45:38.019854 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-horizon:current" Sep 30 05:45:38 crc kubenswrapper[4956]: E0930 05:45:38.019977 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.rdoproject.org/podified-master-centos10/openstack-horizon:current,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n544hf6h88h5b4h646h87h68h586hc4h5c9h567hb9h595hb9h58dh54bh5dhcfh66dhd6h676h54hb9h599h7hfch85hcbh645h56h548h567q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rtc26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6cf65b7d5c-99r5q_openstack(2e4921bb-4cc7-477b-b932-848d0c4d2a09): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:45:38 crc kubenswrapper[4956]: E0930 
05:45:38.022089 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-horizon:current\\\"\"]" pod="openstack/horizon-6cf65b7d5c-99r5q" podUID="2e4921bb-4cc7-477b-b932-848d0c4d2a09" Sep 30 05:45:38 crc kubenswrapper[4956]: E0930 05:45:38.419135 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current" Sep 30 05:45:38 crc kubenswrapper[4956]: E0930 05:45:38.419195 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current" Sep 30 05:45:38 crc kubenswrapper[4956]: E0930 05:45:38.419333 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bch55h557h66h54bh567hf8h676h587h5dbh54fh557h6h5f5h545h97h546h59dh5bh544h85h59ch658h5c8h664h697h58fh656h9hbch68fh8fq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bjjmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(5f89721a-6d95-46fd-9a8f-d701ccda87b8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.508172 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e51333ef-f104-4d0d-a00b-76a2800a425a","Type":"ContainerDied","Data":"10361506f80e036f5020c0e8ca9ddf35792b448c5f6cad45a61b5f9a07988f7f"} Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.508280 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10361506f80e036f5020c0e8ca9ddf35792b448c5f6cad45a61b5f9a07988f7f" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.516235 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7e014251-976b-4acd-ac9c-30b14ed29c6c","Type":"ContainerDied","Data":"fa3ae532d40f56a63b7d99aa8c34a1a33c1b1f87b46ca752802fd77ed16e4eaf"} Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.516328 4956 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="fa3ae532d40f56a63b7d99aa8c34a1a33c1b1f87b46ca752802fd77ed16e4eaf" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.521624 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" event={"ID":"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1","Type":"ContainerDied","Data":"a2b951397f44b0647db257f43474cdca5bf403fb26a963e3fb6ca202705a2c27"} Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.522176 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2b951397f44b0647db257f43474cdca5bf403fb26a963e3fb6ca202705a2c27" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.608919 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.618433 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.625953 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743106 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"e51333ef-f104-4d0d-a00b-76a2800a425a\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743204 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"7e014251-976b-4acd-ac9c-30b14ed29c6c\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743239 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e51333ef-f104-4d0d-a00b-76a2800a425a-logs\") pod \"e51333ef-f104-4d0d-a00b-76a2800a425a\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743294 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-scripts\") pod \"e51333ef-f104-4d0d-a00b-76a2800a425a\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743324 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e014251-976b-4acd-ac9c-30b14ed29c6c-logs\") pod \"7e014251-976b-4acd-ac9c-30b14ed29c6c\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743351 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8tbw\" (UniqueName: \"kubernetes.io/projected/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-kube-api-access-d8tbw\") pod 
\"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743383 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-config-data\") pod \"e51333ef-f104-4d0d-a00b-76a2800a425a\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743422 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-combined-ca-bundle\") pod \"e51333ef-f104-4d0d-a00b-76a2800a425a\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743453 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-ovsdbserver-nb\") pod \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743527 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcjv4\" (UniqueName: \"kubernetes.io/projected/e51333ef-f104-4d0d-a00b-76a2800a425a-kube-api-access-hcjv4\") pod \"e51333ef-f104-4d0d-a00b-76a2800a425a\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743559 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-scripts\") pod \"7e014251-976b-4acd-ac9c-30b14ed29c6c\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743590 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-config-data\") pod \"7e014251-976b-4acd-ac9c-30b14ed29c6c\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743628 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-ovsdbserver-sb\") pod \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743661 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-config\") pod \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743736 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-public-tls-certs\") pod \"e51333ef-f104-4d0d-a00b-76a2800a425a\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743772 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-svc\") pod \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743815 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e014251-976b-4acd-ac9c-30b14ed29c6c-httpd-run\") pod \"7e014251-976b-4acd-ac9c-30b14ed29c6c\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743844 4956 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-combined-ca-bundle\") pod \"7e014251-976b-4acd-ac9c-30b14ed29c6c\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743870 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e51333ef-f104-4d0d-a00b-76a2800a425a-httpd-run\") pod \"e51333ef-f104-4d0d-a00b-76a2800a425a\" (UID: \"e51333ef-f104-4d0d-a00b-76a2800a425a\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743907 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-swift-storage-0\") pod \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743942 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whfrp\" (UniqueName: \"kubernetes.io/projected/7e014251-976b-4acd-ac9c-30b14ed29c6c-kube-api-access-whfrp\") pod \"7e014251-976b-4acd-ac9c-30b14ed29c6c\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.743964 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-internal-tls-certs\") pod \"7e014251-976b-4acd-ac9c-30b14ed29c6c\" (UID: \"7e014251-976b-4acd-ac9c-30b14ed29c6c\") " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.746807 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e014251-976b-4acd-ac9c-30b14ed29c6c-logs" (OuterVolumeSpecName: "logs") pod 
"7e014251-976b-4acd-ac9c-30b14ed29c6c" (UID: "7e014251-976b-4acd-ac9c-30b14ed29c6c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.747561 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "7e014251-976b-4acd-ac9c-30b14ed29c6c" (UID: "7e014251-976b-4acd-ac9c-30b14ed29c6c"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.747719 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e51333ef-f104-4d0d-a00b-76a2800a425a-logs" (OuterVolumeSpecName: "logs") pod "e51333ef-f104-4d0d-a00b-76a2800a425a" (UID: "e51333ef-f104-4d0d-a00b-76a2800a425a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.747927 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e51333ef-f104-4d0d-a00b-76a2800a425a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e51333ef-f104-4d0d-a00b-76a2800a425a" (UID: "e51333ef-f104-4d0d-a00b-76a2800a425a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.748167 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "e51333ef-f104-4d0d-a00b-76a2800a425a" (UID: "e51333ef-f104-4d0d-a00b-76a2800a425a"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.748186 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e014251-976b-4acd-ac9c-30b14ed29c6c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7e014251-976b-4acd-ac9c-30b14ed29c6c" (UID: "7e014251-976b-4acd-ac9c-30b14ed29c6c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.760402 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-kube-api-access-d8tbw" (OuterVolumeSpecName: "kube-api-access-d8tbw") pod "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" (UID: "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1"). InnerVolumeSpecName "kube-api-access-d8tbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.763228 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e014251-976b-4acd-ac9c-30b14ed29c6c-kube-api-access-whfrp" (OuterVolumeSpecName: "kube-api-access-whfrp") pod "7e014251-976b-4acd-ac9c-30b14ed29c6c" (UID: "7e014251-976b-4acd-ac9c-30b14ed29c6c"). InnerVolumeSpecName "kube-api-access-whfrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.763383 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-scripts" (OuterVolumeSpecName: "scripts") pod "7e014251-976b-4acd-ac9c-30b14ed29c6c" (UID: "7e014251-976b-4acd-ac9c-30b14ed29c6c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.781732 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51333ef-f104-4d0d-a00b-76a2800a425a-kube-api-access-hcjv4" (OuterVolumeSpecName: "kube-api-access-hcjv4") pod "e51333ef-f104-4d0d-a00b-76a2800a425a" (UID: "e51333ef-f104-4d0d-a00b-76a2800a425a"). InnerVolumeSpecName "kube-api-access-hcjv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.783829 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-scripts" (OuterVolumeSpecName: "scripts") pod "e51333ef-f104-4d0d-a00b-76a2800a425a" (UID: "e51333ef-f104-4d0d-a00b-76a2800a425a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.824535 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e014251-976b-4acd-ac9c-30b14ed29c6c" (UID: "7e014251-976b-4acd-ac9c-30b14ed29c6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.842619 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7e014251-976b-4acd-ac9c-30b14ed29c6c" (UID: "7e014251-976b-4acd-ac9c-30b14ed29c6c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.843857 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e51333ef-f104-4d0d-a00b-76a2800a425a" (UID: "e51333ef-f104-4d0d-a00b-76a2800a425a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846237 4956 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e014251-976b-4acd-ac9c-30b14ed29c6c-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846263 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846275 4956 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e51333ef-f104-4d0d-a00b-76a2800a425a-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846284 4956 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846295 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whfrp\" (UniqueName: \"kubernetes.io/projected/7e014251-976b-4acd-ac9c-30b14ed29c6c-kube-api-access-whfrp\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846321 4956 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846336 4956 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846344 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e51333ef-f104-4d0d-a00b-76a2800a425a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846352 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846360 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e014251-976b-4acd-ac9c-30b14ed29c6c-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846367 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8tbw\" (UniqueName: \"kubernetes.io/projected/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-kube-api-access-d8tbw\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846375 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.846383 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcjv4\" (UniqueName: \"kubernetes.io/projected/e51333ef-f104-4d0d-a00b-76a2800a425a-kube-api-access-hcjv4\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 
05:45:38.846392 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.855652 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e51333ef-f104-4d0d-a00b-76a2800a425a" (UID: "e51333ef-f104-4d0d-a00b-76a2800a425a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.862773 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" (UID: "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.871993 4956 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.882983 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-config-data" (OuterVolumeSpecName: "config-data") pod "7e014251-976b-4acd-ac9c-30b14ed29c6c" (UID: "7e014251-976b-4acd-ac9c-30b14ed29c6c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.896030 4956 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.897821 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" (UID: "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.906387 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" (UID: "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: E0930 05:45:38.906958 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-svc podName:eca6601c-9fe9-4e8e-9a47-2d68b02b06e1 nodeName:}" failed. No retries permitted until 2025-09-30 05:45:39.406912747 +0000 UTC m=+1009.734033272 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-svc") pod "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" (UID: "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1") : error deleting /var/lib/kubelet/pods/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1/volume-subpaths: remove /var/lib/kubelet/pods/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1/volume-subpaths: no such file or directory Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.907080 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-config" (OuterVolumeSpecName: "config") pod "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" (UID: "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.917139 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-config-data" (OuterVolumeSpecName: "config-data") pod "e51333ef-f104-4d0d-a00b-76a2800a425a" (UID: "e51333ef-f104-4d0d-a00b-76a2800a425a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.948681 4956 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.948719 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.948729 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.948738 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e014251-976b-4acd-ac9c-30b14ed29c6c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.948747 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.948756 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.948766 4956 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e51333ef-f104-4d0d-a00b-76a2800a425a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.948774 4956 reconciler_common.go:293] "Volume detached for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:38 crc kubenswrapper[4956]: I0930 05:45:38.948785 4956 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: E0930 05:45:39.285662 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current" Sep 30 05:45:39 crc kubenswrapper[4956]: E0930 05:45:39.285720 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current" Sep 30 05:45:39 crc kubenswrapper[4956]: E0930 05:45:39.285850 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4p2hv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-cd8q4_openstack(d947526f-907a-4951-bbc4-51e29c560a06): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:45:39 crc kubenswrapper[4956]: E0930 05:45:39.287027 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-cd8q4" 
podUID="d947526f-907a-4951-bbc4-51e29c560a06" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.367749 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.368098 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.374725 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.462165 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-svc\") pod \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\" (UID: \"eca6601c-9fe9-4e8e-9a47-2d68b02b06e1\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.462228 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a55e72da-6cd5-48ac-b98b-ce39235f96f1-scripts\") pod \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.462280 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgfrx\" (UniqueName: \"kubernetes.io/projected/a55e72da-6cd5-48ac-b98b-ce39235f96f1-kube-api-access-sgfrx\") pod \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463111 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-combined-ca-bundle\") pod \"7c50f741-28f6-4bb3-865e-fc21813c1b00\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " 
Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463301 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-config-data\") pod \"7c50f741-28f6-4bb3-865e-fc21813c1b00\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463355 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a55e72da-6cd5-48ac-b98b-ce39235f96f1-logs\") pod \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463411 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e4921bb-4cc7-477b-b932-848d0c4d2a09-logs\") pod \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463447 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2e4921bb-4cc7-477b-b932-848d0c4d2a09-horizon-secret-key\") pod \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463485 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-custom-prometheus-ca\") pod \"7c50f741-28f6-4bb3-865e-fc21813c1b00\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463526 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a55e72da-6cd5-48ac-b98b-ce39235f96f1-config-data\") pod 
\"a55e72da-6cd5-48ac-b98b-ce39235f96f1\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463555 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a55e72da-6cd5-48ac-b98b-ce39235f96f1-horizon-secret-key\") pod \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\" (UID: \"a55e72da-6cd5-48ac-b98b-ce39235f96f1\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463632 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c50f741-28f6-4bb3-865e-fc21813c1b00-logs\") pod \"7c50f741-28f6-4bb3-865e-fc21813c1b00\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463703 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtc26\" (UniqueName: \"kubernetes.io/projected/2e4921bb-4cc7-477b-b932-848d0c4d2a09-kube-api-access-rtc26\") pod \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463735 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e4921bb-4cc7-477b-b932-848d0c4d2a09-config-data\") pod \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463734 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a55e72da-6cd5-48ac-b98b-ce39235f96f1-scripts" (OuterVolumeSpecName: "scripts") pod "a55e72da-6cd5-48ac-b98b-ce39235f96f1" (UID: "a55e72da-6cd5-48ac-b98b-ce39235f96f1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463763 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbx7f\" (UniqueName: \"kubernetes.io/projected/7c50f741-28f6-4bb3-865e-fc21813c1b00-kube-api-access-nbx7f\") pod \"7c50f741-28f6-4bb3-865e-fc21813c1b00\" (UID: \"7c50f741-28f6-4bb3-865e-fc21813c1b00\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.463804 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e4921bb-4cc7-477b-b932-848d0c4d2a09-scripts\") pod \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\" (UID: \"2e4921bb-4cc7-477b-b932-848d0c4d2a09\") " Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.464233 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" (UID: "eca6601c-9fe9-4e8e-9a47-2d68b02b06e1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.464733 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.464755 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a55e72da-6cd5-48ac-b98b-ce39235f96f1-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.465293 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c50f741-28f6-4bb3-865e-fc21813c1b00-logs" (OuterVolumeSpecName: "logs") pod "7c50f741-28f6-4bb3-865e-fc21813c1b00" (UID: "7c50f741-28f6-4bb3-865e-fc21813c1b00"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.466035 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a55e72da-6cd5-48ac-b98b-ce39235f96f1-config-data" (OuterVolumeSpecName: "config-data") pod "a55e72da-6cd5-48ac-b98b-ce39235f96f1" (UID: "a55e72da-6cd5-48ac-b98b-ce39235f96f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.466198 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e4921bb-4cc7-477b-b932-848d0c4d2a09-logs" (OuterVolumeSpecName: "logs") pod "2e4921bb-4cc7-477b-b932-848d0c4d2a09" (UID: "2e4921bb-4cc7-477b-b932-848d0c4d2a09"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.466698 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a55e72da-6cd5-48ac-b98b-ce39235f96f1-logs" (OuterVolumeSpecName: "logs") pod "a55e72da-6cd5-48ac-b98b-ce39235f96f1" (UID: "a55e72da-6cd5-48ac-b98b-ce39235f96f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.468988 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4921bb-4cc7-477b-b932-848d0c4d2a09-config-data" (OuterVolumeSpecName: "config-data") pod "2e4921bb-4cc7-477b-b932-848d0c4d2a09" (UID: "2e4921bb-4cc7-477b-b932-848d0c4d2a09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.470191 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4921bb-4cc7-477b-b932-848d0c4d2a09-scripts" (OuterVolumeSpecName: "scripts") pod "2e4921bb-4cc7-477b-b932-848d0c4d2a09" (UID: "2e4921bb-4cc7-477b-b932-848d0c4d2a09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.479087 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4921bb-4cc7-477b-b932-848d0c4d2a09-kube-api-access-rtc26" (OuterVolumeSpecName: "kube-api-access-rtc26") pod "2e4921bb-4cc7-477b-b932-848d0c4d2a09" (UID: "2e4921bb-4cc7-477b-b932-848d0c4d2a09"). InnerVolumeSpecName "kube-api-access-rtc26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.485507 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c50f741-28f6-4bb3-865e-fc21813c1b00-kube-api-access-nbx7f" (OuterVolumeSpecName: "kube-api-access-nbx7f") pod "7c50f741-28f6-4bb3-865e-fc21813c1b00" (UID: "7c50f741-28f6-4bb3-865e-fc21813c1b00"). InnerVolumeSpecName "kube-api-access-nbx7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.487006 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a55e72da-6cd5-48ac-b98b-ce39235f96f1-kube-api-access-sgfrx" (OuterVolumeSpecName: "kube-api-access-sgfrx") pod "a55e72da-6cd5-48ac-b98b-ce39235f96f1" (UID: "a55e72da-6cd5-48ac-b98b-ce39235f96f1"). InnerVolumeSpecName "kube-api-access-sgfrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.522276 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a55e72da-6cd5-48ac-b98b-ce39235f96f1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a55e72da-6cd5-48ac-b98b-ce39235f96f1" (UID: "a55e72da-6cd5-48ac-b98b-ce39235f96f1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.522354 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e4921bb-4cc7-477b-b932-848d0c4d2a09-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2e4921bb-4cc7-477b-b932-848d0c4d2a09" (UID: "2e4921bb-4cc7-477b-b932-848d0c4d2a09"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.543961 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dcbfc4c57-z8gm9" event={"ID":"a55e72da-6cd5-48ac-b98b-ce39235f96f1","Type":"ContainerDied","Data":"d3bdcdc270a6f7b5c5a5ea04c9f6ac49f912ee4048311f7dcedfe85e609730ed"} Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.544081 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dcbfc4c57-z8gm9" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.547808 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cf65b7d5c-99r5q" event={"ID":"2e4921bb-4cc7-477b-b932-848d0c4d2a09","Type":"ContainerDied","Data":"dbf497f0d4957cb23dbf081d00f51b2c023c85616f592c5c9b90218b062bcc71"} Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.547912 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cf65b7d5c-99r5q" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.568439 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c50f741-28f6-4bb3-865e-fc21813c1b00" (UID: "7c50f741-28f6-4bb3-865e-fc21813c1b00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.571321 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.572954 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"7c50f741-28f6-4bb3-865e-fc21813c1b00","Type":"ContainerDied","Data":"d438b85792f9e7a9e2c5ab07a5886697d00f5bd0104894042ecdcfe119080d6d"} Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.573257 4956 scope.go:117] "RemoveContainer" containerID="fdb70915fb7c98a304429f380940af88934d57f1b6e20dc050936830e371b909" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.573427 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579480 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgfrx\" (UniqueName: \"kubernetes.io/projected/a55e72da-6cd5-48ac-b98b-ce39235f96f1-kube-api-access-sgfrx\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579517 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579528 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a55e72da-6cd5-48ac-b98b-ce39235f96f1-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579539 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e4921bb-4cc7-477b-b932-848d0c4d2a09-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579551 4956 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/2e4921bb-4cc7-477b-b932-848d0c4d2a09-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579559 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a55e72da-6cd5-48ac-b98b-ce39235f96f1-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579569 4956 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a55e72da-6cd5-48ac-b98b-ce39235f96f1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579581 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c50f741-28f6-4bb3-865e-fc21813c1b00-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579594 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtc26\" (UniqueName: \"kubernetes.io/projected/2e4921bb-4cc7-477b-b932-848d0c4d2a09-kube-api-access-rtc26\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579603 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e4921bb-4cc7-477b-b932-848d0c4d2a09-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579611 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbx7f\" (UniqueName: \"kubernetes.io/projected/7c50f741-28f6-4bb3-865e-fc21813c1b00-kube-api-access-nbx7f\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579620 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e4921bb-4cc7-477b-b932-848d0c4d2a09-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc 
kubenswrapper[4956]: I0930 05:45:39.579716 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.579796 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.580342 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "7c50f741-28f6-4bb3-865e-fc21813c1b00" (UID: "7c50f741-28f6-4bb3-865e-fc21813c1b00"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: E0930 05:45:39.592985 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current\\\"\"" pod="openstack/barbican-db-sync-cd8q4" podUID="d947526f-907a-4951-bbc4-51e29c560a06" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.644343 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-config-data" (OuterVolumeSpecName: "config-data") pod "7c50f741-28f6-4bb3-865e-fc21813c1b00" (UID: "7c50f741-28f6-4bb3-865e-fc21813c1b00"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.681571 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.681597 4956 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c50f741-28f6-4bb3-865e-fc21813c1b00-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.799360 4956 scope.go:117] "RemoveContainer" containerID="e830732805e896396d587045b8c84fa19caae122664496e4b9cc7b7e02f7b189" Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.917022 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cf65b7d5c-99r5q"] Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.945003 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6cf65b7d5c-99r5q"] Sep 30 05:45:39 crc kubenswrapper[4956]: I0930 05:45:39.973291 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.009499 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.041099 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:45:40 crc kubenswrapper[4956]: E0930 05:45:40.043279 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51333ef-f104-4d0d-a00b-76a2800a425a" containerName="glance-httpd" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.043314 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51333ef-f104-4d0d-a00b-76a2800a425a" containerName="glance-httpd" Sep 30 05:45:40 crc 
kubenswrapper[4956]: E0930 05:45:40.043347 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" containerName="init" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.043356 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" containerName="init" Sep 30 05:45:40 crc kubenswrapper[4956]: E0930 05:45:40.043376 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e014251-976b-4acd-ac9c-30b14ed29c6c" containerName="glance-httpd" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.043384 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e014251-976b-4acd-ac9c-30b14ed29c6c" containerName="glance-httpd" Sep 30 05:45:40 crc kubenswrapper[4956]: E0930 05:45:40.043411 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c50f741-28f6-4bb3-865e-fc21813c1b00" containerName="watcher-api" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.043421 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c50f741-28f6-4bb3-865e-fc21813c1b00" containerName="watcher-api" Sep 30 05:45:40 crc kubenswrapper[4956]: E0930 05:45:40.043449 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c50f741-28f6-4bb3-865e-fc21813c1b00" containerName="watcher-api-log" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.043457 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c50f741-28f6-4bb3-865e-fc21813c1b00" containerName="watcher-api-log" Sep 30 05:45:40 crc kubenswrapper[4956]: E0930 05:45:40.043479 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" containerName="dnsmasq-dns" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.043487 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" containerName="dnsmasq-dns" Sep 30 05:45:40 crc kubenswrapper[4956]: E0930 05:45:40.043522 4956 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51333ef-f104-4d0d-a00b-76a2800a425a" containerName="glance-log" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.043533 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51333ef-f104-4d0d-a00b-76a2800a425a" containerName="glance-log" Sep 30 05:45:40 crc kubenswrapper[4956]: E0930 05:45:40.043559 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e014251-976b-4acd-ac9c-30b14ed29c6c" containerName="glance-log" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.043569 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e014251-976b-4acd-ac9c-30b14ed29c6c" containerName="glance-log" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.054453 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51333ef-f104-4d0d-a00b-76a2800a425a" containerName="glance-log" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.054587 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e014251-976b-4acd-ac9c-30b14ed29c6c" containerName="glance-httpd" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.054630 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c50f741-28f6-4bb3-865e-fc21813c1b00" containerName="watcher-api" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.054680 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" containerName="dnsmasq-dns" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.054728 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e014251-976b-4acd-ac9c-30b14ed29c6c" containerName="glance-log" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.054792 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51333ef-f104-4d0d-a00b-76a2800a425a" containerName="glance-httpd" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.054833 4956 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7c50f741-28f6-4bb3-865e-fc21813c1b00" containerName="watcher-api-log" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.064382 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.086713 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mwhll" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.086926 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.087383 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.087549 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.087621 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7dcbfc4c57-z8gm9"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.105324 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7dcbfc4c57-z8gm9"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.118243 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.144523 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.160947 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.171381 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-hwpd5"] Sep 30 05:45:40 crc 
kubenswrapper[4956]: I0930 05:45:40.185962 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-hwpd5"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.191996 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-scripts\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.192095 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2lw7\" (UniqueName: \"kubernetes.io/projected/a79ac967-4c58-483a-9ef8-76033c9c5d83-kube-api-access-m2lw7\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.192159 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79ac967-4c58-483a-9ef8-76033c9c5d83-logs\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.192197 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a79ac967-4c58-483a-9ef8-76033c9c5d83-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.192222 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.192261 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.192282 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.192307 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-config-data\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.213726 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.216219 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.225065 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.226882 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.227234 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.246203 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.261585 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.271313 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.273583 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.279261 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.279340 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.279420 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.298127 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2lw7\" (UniqueName: \"kubernetes.io/projected/a79ac967-4c58-483a-9ef8-76033c9c5d83-kube-api-access-m2lw7\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.298232 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79ac967-4c58-483a-9ef8-76033c9c5d83-logs\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.298292 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a79ac967-4c58-483a-9ef8-76033c9c5d83-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.298320 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: 
\"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.298376 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.298396 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.298430 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-config-data\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.298474 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-scripts\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.302182 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.302524 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.308031 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.310723 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79ac967-4c58-483a-9ef8-76033c9c5d83-logs\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.310904 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-scripts\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.311339 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a79ac967-4c58-483a-9ef8-76033c9c5d83-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.311399 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.317080 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-config-data\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.339981 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68c96b554d-8mtnt"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.342884 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2lw7\" (UniqueName: \"kubernetes.io/projected/a79ac967-4c58-483a-9ef8-76033c9c5d83-kube-api-access-m2lw7\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.377439 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.377825 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e4921bb-4cc7-477b-b932-848d0c4d2a09" path="/var/lib/kubelet/pods/2e4921bb-4cc7-477b-b932-848d0c4d2a09/volumes" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.382875 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c50f741-28f6-4bb3-865e-fc21813c1b00" path="/var/lib/kubelet/pods/7c50f741-28f6-4bb3-865e-fc21813c1b00/volumes" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.388547 4956 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e014251-976b-4acd-ac9c-30b14ed29c6c" path="/var/lib/kubelet/pods/7e014251-976b-4acd-ac9c-30b14ed29c6c/volumes" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.389327 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a55e72da-6cd5-48ac-b98b-ce39235f96f1" path="/var/lib/kubelet/pods/a55e72da-6cd5-48ac-b98b-ce39235f96f1/volumes" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.389778 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e51333ef-f104-4d0d-a00b-76a2800a425a" path="/var/lib/kubelet/pods/e51333ef-f104-4d0d-a00b-76a2800a425a/volumes" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.391099 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" path="/var/lib/kubelet/pods/eca6601c-9fe9-4e8e-9a47-2d68b02b06e1/volumes" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.400312 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.400375 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d2aafb3-866b-41ad-bdeb-38e53b81934a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.400428 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrcpm\" (UniqueName: 
\"kubernetes.io/projected/7d2aafb3-866b-41ad-bdeb-38e53b81934a-kube-api-access-wrcpm\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.400484 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-config-data\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.400516 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.400538 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.400956 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-public-tls-certs\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.401067 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7d2aafb3-866b-41ad-bdeb-38e53b81934a-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.401110 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g276v\" (UniqueName: \"kubernetes.io/projected/5268a4a4-b72f-47e6-a485-04dcc5935087-kube-api-access-g276v\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.401172 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.401341 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.401425 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5268a4a4-b72f-47e6-a485-04dcc5935087-logs\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.401726 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.401749 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.401777 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.438768 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.503575 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrcpm\" (UniqueName: \"kubernetes.io/projected/7d2aafb3-866b-41ad-bdeb-38e53b81934a-kube-api-access-wrcpm\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.503664 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-config-data\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.503703 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.503723 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.503783 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-public-tls-certs\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.503811 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d2aafb3-866b-41ad-bdeb-38e53b81934a-logs\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.503837 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g276v\" (UniqueName: \"kubernetes.io/projected/5268a4a4-b72f-47e6-a485-04dcc5935087-kube-api-access-g276v\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.503871 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.503912 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.503948 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5268a4a4-b72f-47e6-a485-04dcc5935087-logs\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.504021 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.504044 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.504070 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.504130 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.504157 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d2aafb3-866b-41ad-bdeb-38e53b81934a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.504470 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d2aafb3-866b-41ad-bdeb-38e53b81934a-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.505997 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d2aafb3-866b-41ad-bdeb-38e53b81934a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.510753 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t7llp"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.512193 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.513660 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.515081 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5268a4a4-b72f-47e6-a485-04dcc5935087-logs\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.521765 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-public-tls-certs\") pod \"watcher-api-0\" 
(UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.521763 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.522214 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.522361 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hvzfv"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.524571 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.525109 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-config-data\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.530769 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.530929 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nnr6k"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.530769 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.531803 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5268a4a4-b72f-47e6-a485-04dcc5935087-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.538510 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f96b888bb-bhtl9"] Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.540423 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrcpm\" (UniqueName: \"kubernetes.io/projected/7d2aafb3-866b-41ad-bdeb-38e53b81934a-kube-api-access-wrcpm\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.552264 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.559594 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g276v\" (UniqueName: \"kubernetes.io/projected/5268a4a4-b72f-47e6-a485-04dcc5935087-kube-api-access-g276v\") pod \"watcher-api-0\" (UID: \"5268a4a4-b72f-47e6-a485-04dcc5935087\") " pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.596725 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"48193dc0-3638-40e3-8b54-2b2049bd5925","Type":"ContainerStarted","Data":"1be10c431d3579cb6999fcf2de4a6df5f16108edb4c5ebb162b0e95615d0c9ec"} Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.600097 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aad83b1a-8ea2-4f43-a6e3-b8e844a65115","Type":"ContainerStarted","Data":"3ac0deeb9dd4989b4419240d3424d601947a4baa044879dc4526b2cdced4bc45"} Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.606035 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dfd95655c-tvpq8" event={"ID":"b25f900c-e486-4143-8ef9-0416286dc2dc","Type":"ContainerStarted","Data":"b305e374a7105939e3230a705ca3319a1e63c3c43ed3f80a90c520632d8e2ec8"} Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.606087 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dfd95655c-tvpq8" event={"ID":"b25f900c-e486-4143-8ef9-0416286dc2dc","Type":"ContainerStarted","Data":"64f339e43dae649aa08f6d71106076c98f8eb556995a4093bf8a403dd78559d3"} Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.606176 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-dfd95655c-tvpq8" podUID="b25f900c-e486-4143-8ef9-0416286dc2dc" containerName="horizon" containerID="cri-o://b305e374a7105939e3230a705ca3319a1e63c3c43ed3f80a90c520632d8e2ec8" gracePeriod=30 Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.606155 4956 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/horizon-dfd95655c-tvpq8" podUID="b25f900c-e486-4143-8ef9-0416286dc2dc" containerName="horizon-log" containerID="cri-o://64f339e43dae649aa08f6d71106076c98f8eb556995a4093bf8a403dd78559d3" gracePeriod=30 Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.609750 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68c96b554d-8mtnt" event={"ID":"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95","Type":"ContainerStarted","Data":"39f6fba81bf2338227b34babb54a50886f23dca04dacce224c3d4af146eaea5f"} Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.609793 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68c96b554d-8mtnt" event={"ID":"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95","Type":"ContainerStarted","Data":"ca0b91b0c9cdd07aa2a4a914b6bec221c543d307509862bc0d5da7d55f1c0020"} Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.623273 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.580271466 podStartE2EDuration="20.623251268s" podCreationTimestamp="2025-09-30 05:45:20 +0000 UTC" firstStartedPulling="2025-09-30 05:45:21.41971906 +0000 UTC m=+991.746839585" lastFinishedPulling="2025-09-30 05:45:38.462698852 +0000 UTC m=+1008.789819387" observedRunningTime="2025-09-30 05:45:40.616181165 +0000 UTC m=+1010.943301700" watchObservedRunningTime="2025-09-30 05:45:40.623251268 +0000 UTC m=+1010.950371803" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.628382 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dqs9m" event={"ID":"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5","Type":"ContainerStarted","Data":"60d50e8397fd199fc2a8700196daf630ff9b71e0e7bc2f21f6d16fcfc823b648"} Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.636967 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=3.363174322 
podStartE2EDuration="20.636950629s" podCreationTimestamp="2025-09-30 05:45:20 +0000 UTC" firstStartedPulling="2025-09-30 05:45:21.127058939 +0000 UTC m=+991.454179464" lastFinishedPulling="2025-09-30 05:45:38.400835246 +0000 UTC m=+1008.727955771" observedRunningTime="2025-09-30 05:45:40.633301343 +0000 UTC m=+1010.960421878" watchObservedRunningTime="2025-09-30 05:45:40.636950629 +0000 UTC m=+1010.964071164" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.637432 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.646942 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.646996 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.659136 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-dfd95655c-tvpq8" podStartSLOduration=3.134324028 podStartE2EDuration="23.659100304s" podCreationTimestamp="2025-09-30 05:45:17 +0000 UTC" firstStartedPulling="2025-09-30 05:45:18.876604746 +0000 UTC m=+989.203725271" lastFinishedPulling="2025-09-30 05:45:39.401381032 +0000 UTC m=+1009.728501547" observedRunningTime="2025-09-30 05:45:40.651677412 +0000 UTC m=+1010.978797937" watchObservedRunningTime="2025-09-30 05:45:40.659100304 +0000 UTC m=+1010.986220819" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.681018 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-dqs9m" podStartSLOduration=4.14745056 podStartE2EDuration="23.680991963s" podCreationTimestamp="2025-09-30 05:45:17 +0000 UTC" firstStartedPulling="2025-09-30 05:45:18.921802117 +0000 UTC m=+989.248922642" lastFinishedPulling="2025-09-30 05:45:38.45534352 +0000 UTC 
m=+1008.782464045" observedRunningTime="2025-09-30 05:45:40.666669263 +0000 UTC m=+1010.993789808" watchObservedRunningTime="2025-09-30 05:45:40.680991963 +0000 UTC m=+1011.008112488" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.688565 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Sep 30 05:45:40 crc kubenswrapper[4956]: I0930 05:45:40.702215 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 05:45:40 crc kubenswrapper[4956]: W0930 05:45:40.984387 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1b07e53_1ac0_4958_8ce3_b43ed35fbdef.slice/crio-b0ff49c92bf15cf118ee74cdc20bf178aca60831f8bf4cebd5d158f249c773e3 WatchSource:0}: Error finding container b0ff49c92bf15cf118ee74cdc20bf178aca60831f8bf4cebd5d158f249c773e3: Status 404 returned error can't find the container with id b0ff49c92bf15cf118ee74cdc20bf178aca60831f8bf4cebd5d158f249c773e3 Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.672542 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.694776 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f89721a-6d95-46fd-9a8f-d701ccda87b8","Type":"ContainerStarted","Data":"e3cfd7068ca0b94e06643d04d3ee963e83a977de8c130820e50fda945ba0a2aa"} Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.697717 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68c96b554d-8mtnt" event={"ID":"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95","Type":"ContainerStarted","Data":"7eafd8dbb682893d7dec3183ccf8ede2bfcfc62744d82946285977f5c6f648d8"} Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.703548 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 
05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.732097 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f96b888bb-bhtl9" event={"ID":"f29ac7f2-13b9-47d8-9218-fb08840e6704","Type":"ContainerStarted","Data":"f014a4da75fdac2a8b3d01335c1b1799b7a2fa57558cef8ca1dce85700a1159e"} Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.732161 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f96b888bb-bhtl9" event={"ID":"f29ac7f2-13b9-47d8-9218-fb08840e6704","Type":"ContainerStarted","Data":"e682ce754c9cbd62db27adb02e6aa4e157cd76c9bf92fdfe0271e0c2a918a7f5"} Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.733857 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-68c96b554d-8mtnt" podStartSLOduration=15.733835754 podStartE2EDuration="15.733835754s" podCreationTimestamp="2025-09-30 05:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:41.719699129 +0000 UTC m=+1012.046819654" watchObservedRunningTime="2025-09-30 05:45:41.733835754 +0000 UTC m=+1012.060956299" Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.739600 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nnr6k" event={"ID":"740de0e5-c3e9-43bc-bb01-8c240f50070e","Type":"ContainerStarted","Data":"a930588477252f5460e9259b0c01523b245f657a647303988300021b7118ec82"} Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.742725 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t7llp" event={"ID":"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef","Type":"ContainerStarted","Data":"ec2d7b018b094efac24191b05f52c1eed8ac1b7d18057ad473bffafb88b0a3ca"} Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.742755 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t7llp" 
event={"ID":"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef","Type":"ContainerStarted","Data":"b0ff49c92bf15cf118ee74cdc20bf178aca60831f8bf4cebd5d158f249c773e3"} Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.749922 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hvzfv" event={"ID":"d7d95941-d95f-4302-84f1-9230a7b9001f","Type":"ContainerStarted","Data":"de0db8d3f713cb626f0e9868ee39ba9e2ca3744a472c2f70318349ecca3f4c47"} Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.749958 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hvzfv" event={"ID":"d7d95941-d95f-4302-84f1-9230a7b9001f","Type":"ContainerStarted","Data":"c897cbf71fa3097200e5f5e44fe8e696d2f565ca20b26a8dd66967e0517d986b"} Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.769940 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-t7llp" podStartSLOduration=10.769923348 podStartE2EDuration="10.769923348s" podCreationTimestamp="2025-09-30 05:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:41.764621621 +0000 UTC m=+1012.091742156" watchObservedRunningTime="2025-09-30 05:45:41.769923348 +0000 UTC m=+1012.097043873" Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.794979 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.798074 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-hvzfv" podStartSLOduration=10.798057073 podStartE2EDuration="10.798057073s" podCreationTimestamp="2025-09-30 05:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:41.780615325 +0000 UTC m=+1012.107735850" 
watchObservedRunningTime="2025-09-30 05:45:41.798057073 +0000 UTC m=+1012.125177598" Sep 30 05:45:41 crc kubenswrapper[4956]: W0930 05:45:41.817924 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda79ac967_4c58_483a_9ef8_76033c9c5d83.slice/crio-06668260a629a8ef55be2a234e2b6c86c96d714d98fca89ad5f346989dfa46a4 WatchSource:0}: Error finding container 06668260a629a8ef55be2a234e2b6c86c96d714d98fca89ad5f346989dfa46a4: Status 404 returned error can't find the container with id 06668260a629a8ef55be2a234e2b6c86c96d714d98fca89ad5f346989dfa46a4 Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.818307 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Sep 30 05:45:41 crc kubenswrapper[4956]: I0930 05:45:41.939649 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55b99bf79c-hwpd5" podUID="eca6601c-9fe9-4e8e-9a47-2d68b02b06e1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Sep 30 05:45:42 crc kubenswrapper[4956]: I0930 05:45:42.786163 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f96b888bb-bhtl9" event={"ID":"f29ac7f2-13b9-47d8-9218-fb08840e6704","Type":"ContainerStarted","Data":"c457b7ed3b8368c13a448fc15f51578e18936cfd3a5eb1ac4142d1b423677d5d"} Sep 30 05:45:42 crc kubenswrapper[4956]: I0930 05:45:42.792804 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5268a4a4-b72f-47e6-a485-04dcc5935087","Type":"ContainerStarted","Data":"491d2fe250f5bd031dcdbe3d3d371269a52350bda8f2229a599abe30b24a81b6"} Sep 30 05:45:42 crc kubenswrapper[4956]: I0930 05:45:42.792878 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"5268a4a4-b72f-47e6-a485-04dcc5935087","Type":"ContainerStarted","Data":"b40aae33defa5047f9c5286f30546d43989a9d264891fde4eab03c3038b617d1"} Sep 30 05:45:42 crc kubenswrapper[4956]: I0930 05:45:42.792893 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5268a4a4-b72f-47e6-a485-04dcc5935087","Type":"ContainerStarted","Data":"29e140e73ee49a100764ce6f93c3ab3b818353bec5ca9586423d7fe1390f3c91"} Sep 30 05:45:42 crc kubenswrapper[4956]: I0930 05:45:42.793542 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 05:45:42 crc kubenswrapper[4956]: I0930 05:45:42.796866 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d2aafb3-866b-41ad-bdeb-38e53b81934a","Type":"ContainerStarted","Data":"2742293e6b987c75fb0fe5643f3cd791e1510d598275d9a695949f2e6f04fb51"} Sep 30 05:45:42 crc kubenswrapper[4956]: I0930 05:45:42.800102 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a79ac967-4c58-483a-9ef8-76033c9c5d83","Type":"ContainerStarted","Data":"06668260a629a8ef55be2a234e2b6c86c96d714d98fca89ad5f346989dfa46a4"} Sep 30 05:45:42 crc kubenswrapper[4956]: I0930 05:45:42.834499 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5f96b888bb-bhtl9" podStartSLOduration=16.834475908 podStartE2EDuration="16.834475908s" podCreationTimestamp="2025-09-30 05:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:42.809596265 +0000 UTC m=+1013.136716810" watchObservedRunningTime="2025-09-30 05:45:42.834475908 +0000 UTC m=+1013.161596433" Sep 30 05:45:42 crc kubenswrapper[4956]: I0930 05:45:42.855192 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" 
podStartSLOduration=3.855170748 podStartE2EDuration="3.855170748s" podCreationTimestamp="2025-09-30 05:45:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:42.832691412 +0000 UTC m=+1013.159811957" watchObservedRunningTime="2025-09-30 05:45:42.855170748 +0000 UTC m=+1013.182291273" Sep 30 05:45:43 crc kubenswrapper[4956]: I0930 05:45:43.830241 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a79ac967-4c58-483a-9ef8-76033c9c5d83","Type":"ContainerStarted","Data":"7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263"} Sep 30 05:45:43 crc kubenswrapper[4956]: I0930 05:45:43.836913 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d2aafb3-866b-41ad-bdeb-38e53b81934a","Type":"ContainerStarted","Data":"b1350995cd78544dc325bbdb0c09a57997f0575e9bbf4b8a54b502be3f92321e"} Sep 30 05:45:44 crc kubenswrapper[4956]: I0930 05:45:44.852299 4956 generic.go:334] "Generic (PLEG): container finished" podID="aad83b1a-8ea2-4f43-a6e3-b8e844a65115" containerID="3ac0deeb9dd4989b4419240d3424d601947a4baa044879dc4526b2cdced4bc45" exitCode=1 Sep 30 05:45:44 crc kubenswrapper[4956]: I0930 05:45:44.852392 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aad83b1a-8ea2-4f43-a6e3-b8e844a65115","Type":"ContainerDied","Data":"3ac0deeb9dd4989b4419240d3424d601947a4baa044879dc4526b2cdced4bc45"} Sep 30 05:45:44 crc kubenswrapper[4956]: I0930 05:45:44.855676 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d2aafb3-866b-41ad-bdeb-38e53b81934a","Type":"ContainerStarted","Data":"ea0b8f4e921fbd058a0594225cbca024b0346555bbd3a1c2aa75e707852254a2"} Sep 30 05:45:44 crc kubenswrapper[4956]: I0930 05:45:44.856549 4956 scope.go:117] 
"RemoveContainer" containerID="3ac0deeb9dd4989b4419240d3424d601947a4baa044879dc4526b2cdced4bc45" Sep 30 05:45:44 crc kubenswrapper[4956]: I0930 05:45:44.858856 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a79ac967-4c58-483a-9ef8-76033c9c5d83","Type":"ContainerStarted","Data":"f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05"} Sep 30 05:45:44 crc kubenswrapper[4956]: I0930 05:45:44.868865 4956 generic.go:334] "Generic (PLEG): container finished" podID="a6c94959-3c58-4a09-b7ee-bb13a3f82fb5" containerID="60d50e8397fd199fc2a8700196daf630ff9b71e0e7bc2f21f6d16fcfc823b648" exitCode=0 Sep 30 05:45:44 crc kubenswrapper[4956]: I0930 05:45:44.868930 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dqs9m" event={"ID":"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5","Type":"ContainerDied","Data":"60d50e8397fd199fc2a8700196daf630ff9b71e0e7bc2f21f6d16fcfc823b648"} Sep 30 05:45:44 crc kubenswrapper[4956]: I0930 05:45:44.913029 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.912994055 podStartE2EDuration="5.912994055s" podCreationTimestamp="2025-09-30 05:45:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:44.900578914 +0000 UTC m=+1015.227699429" watchObservedRunningTime="2025-09-30 05:45:44.912994055 +0000 UTC m=+1015.240114580" Sep 30 05:45:44 crc kubenswrapper[4956]: I0930 05:45:44.922188 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.922166783 podStartE2EDuration="5.922166783s" podCreationTimestamp="2025-09-30 05:45:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 
05:45:44.921532243 +0000 UTC m=+1015.248652768" watchObservedRunningTime="2025-09-30 05:45:44.922166783 +0000 UTC m=+1015.249287308" Sep 30 05:45:45 crc kubenswrapper[4956]: I0930 05:45:45.703398 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 05:45:45 crc kubenswrapper[4956]: I0930 05:45:45.704225 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 05:45:45 crc kubenswrapper[4956]: I0930 05:45:45.894615 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aad83b1a-8ea2-4f43-a6e3-b8e844a65115","Type":"ContainerStarted","Data":"2fa0d064ca663afac0ef8c15ff4e0f9d4ad6547c7ed2d0d784b1c82d476b062e"} Sep 30 05:45:45 crc kubenswrapper[4956]: I0930 05:45:45.933405 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 05:45:46 crc kubenswrapper[4956]: I0930 05:45:46.784491 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:46 crc kubenswrapper[4956]: I0930 05:45:46.784869 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:46 crc kubenswrapper[4956]: I0930 05:45:46.888217 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:46 crc kubenswrapper[4956]: I0930 05:45:46.888258 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:45:48 crc kubenswrapper[4956]: I0930 05:45:48.026548 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:45:48 crc kubenswrapper[4956]: I0930 05:45:48.937754 4956 generic.go:334] "Generic (PLEG): container finished" podID="c1b07e53-1ac0-4958-8ce3-b43ed35fbdef" 
containerID="ec2d7b018b094efac24191b05f52c1eed8ac1b7d18057ad473bffafb88b0a3ca" exitCode=0 Sep 30 05:45:48 crc kubenswrapper[4956]: I0930 05:45:48.937817 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t7llp" event={"ID":"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef","Type":"ContainerDied","Data":"ec2d7b018b094efac24191b05f52c1eed8ac1b7d18057ad473bffafb88b0a3ca"} Sep 30 05:45:49 crc kubenswrapper[4956]: I0930 05:45:49.959524 4956 generic.go:334] "Generic (PLEG): container finished" podID="aad83b1a-8ea2-4f43-a6e3-b8e844a65115" containerID="2fa0d064ca663afac0ef8c15ff4e0f9d4ad6547c7ed2d0d784b1c82d476b062e" exitCode=1 Sep 30 05:45:49 crc kubenswrapper[4956]: I0930 05:45:49.959616 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aad83b1a-8ea2-4f43-a6e3-b8e844a65115","Type":"ContainerDied","Data":"2fa0d064ca663afac0ef8c15ff4e0f9d4ad6547c7ed2d0d784b1c82d476b062e"} Sep 30 05:45:49 crc kubenswrapper[4956]: I0930 05:45:49.960759 4956 scope.go:117] "RemoveContainer" containerID="3ac0deeb9dd4989b4419240d3424d601947a4baa044879dc4526b2cdced4bc45" Sep 30 05:45:49 crc kubenswrapper[4956]: I0930 05:45:49.961864 4956 scope.go:117] "RemoveContainer" containerID="2fa0d064ca663afac0ef8c15ff4e0f9d4ad6547c7ed2d0d784b1c82d476b062e" Sep 30 05:45:49 crc kubenswrapper[4956]: E0930 05:45:49.962196 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aad83b1a-8ea2-4f43-a6e3-b8e844a65115)\"" pod="openstack/watcher-decision-engine-0" podUID="aad83b1a-8ea2-4f43-a6e3-b8e844a65115" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.439574 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 
05:45:50.440034 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.469421 4956 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podbca37042-172f-42db-ac83-85a5872720df"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podbca37042-172f-42db-ac83-85a5872720df] : Timed out while waiting for systemd to remove kubepods-besteffort-podbca37042_172f_42db_ac83_85a5872720df.slice" Sep 30 05:45:50 crc kubenswrapper[4956]: E0930 05:45:50.469469 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podbca37042-172f-42db-ac83-85a5872720df] : unable to destroy cgroup paths for cgroup [kubepods besteffort podbca37042-172f-42db-ac83-85a5872720df] : Timed out while waiting for systemd to remove kubepods-besteffort-podbca37042_172f_42db_ac83_85a5872720df.slice" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" podUID="bca37042-172f-42db-ac83-85a5872720df" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.492453 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.492567 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.530164 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.530228 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.530240 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/watcher-decision-engine-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.530250 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.638066 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.638342 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.688502 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.689031 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.704093 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.728715 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.803270 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.826452 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.916108 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-logs\") pod \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.916224 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-combined-ca-bundle\") pod \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.916275 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlvpm\" (UniqueName: \"kubernetes.io/projected/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-kube-api-access-rlvpm\") pod \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.916304 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-scripts\") pod \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.916345 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-config-data\") pod \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.916381 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh5l5\" (UniqueName: 
\"kubernetes.io/projected/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-kube-api-access-jh5l5\") pod \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.916398 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-fernet-keys\") pod \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.916423 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-combined-ca-bundle\") pod \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.916451 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-config-data\") pod \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\" (UID: \"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5\") " Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.916475 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-credential-keys\") pod \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.916491 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-scripts\") pod \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\" (UID: \"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef\") " Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.921183 4956 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-scripts" (OuterVolumeSpecName: "scripts") pod "c1b07e53-1ac0-4958-8ce3-b43ed35fbdef" (UID: "c1b07e53-1ac0-4958-8ce3-b43ed35fbdef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.922710 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-logs" (OuterVolumeSpecName: "logs") pod "a6c94959-3c58-4a09-b7ee-bb13a3f82fb5" (UID: "a6c94959-3c58-4a09-b7ee-bb13a3f82fb5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.927365 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-scripts" (OuterVolumeSpecName: "scripts") pod "a6c94959-3c58-4a09-b7ee-bb13a3f82fb5" (UID: "a6c94959-3c58-4a09-b7ee-bb13a3f82fb5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.928082 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-kube-api-access-rlvpm" (OuterVolumeSpecName: "kube-api-access-rlvpm") pod "a6c94959-3c58-4a09-b7ee-bb13a3f82fb5" (UID: "a6c94959-3c58-4a09-b7ee-bb13a3f82fb5"). InnerVolumeSpecName "kube-api-access-rlvpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.929349 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c1b07e53-1ac0-4958-8ce3-b43ed35fbdef" (UID: "c1b07e53-1ac0-4958-8ce3-b43ed35fbdef"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.930300 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c1b07e53-1ac0-4958-8ce3-b43ed35fbdef" (UID: "c1b07e53-1ac0-4958-8ce3-b43ed35fbdef"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.939371 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-kube-api-access-jh5l5" (OuterVolumeSpecName: "kube-api-access-jh5l5") pod "c1b07e53-1ac0-4958-8ce3-b43ed35fbdef" (UID: "c1b07e53-1ac0-4958-8ce3-b43ed35fbdef"). InnerVolumeSpecName "kube-api-access-jh5l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.970238 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6c94959-3c58-4a09-b7ee-bb13a3f82fb5" (UID: "a6c94959-3c58-4a09-b7ee-bb13a3f82fb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.991065 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-config-data" (OuterVolumeSpecName: "config-data") pod "c1b07e53-1ac0-4958-8ce3-b43ed35fbdef" (UID: "c1b07e53-1ac0-4958-8ce3-b43ed35fbdef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:50 crc kubenswrapper[4956]: I0930 05:45:50.993229 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1b07e53-1ac0-4958-8ce3-b43ed35fbdef" (UID: "c1b07e53-1ac0-4958-8ce3-b43ed35fbdef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.010455 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-config-data" (OuterVolumeSpecName: "config-data") pod "a6c94959-3c58-4a09-b7ee-bb13a3f82fb5" (UID: "a6c94959-3c58-4a09-b7ee-bb13a3f82fb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.016018 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dqs9m" event={"ID":"a6c94959-3c58-4a09-b7ee-bb13a3f82fb5","Type":"ContainerDied","Data":"46a63576fa1a218f0cbfeac73deded86a7623ef743500774d76c82a057d7c33b"} Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.016068 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46a63576fa1a218f0cbfeac73deded86a7623ef743500774d76c82a057d7c33b" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.016034 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dqs9m" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.019440 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.019461 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlvpm\" (UniqueName: \"kubernetes.io/projected/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-kube-api-access-rlvpm\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.019471 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.019482 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.019494 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh5l5\" (UniqueName: \"kubernetes.io/projected/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-kube-api-access-jh5l5\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.019502 4956 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.019512 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 
05:45:51.019520 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.019528 4956 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.019536 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.019543 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.023381 4956 scope.go:117] "RemoveContainer" containerID="2fa0d064ca663afac0ef8c15ff4e0f9d4ad6547c7ed2d0d784b1c82d476b062e" Sep 30 05:45:51 crc kubenswrapper[4956]: E0930 05:45:51.023759 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aad83b1a-8ea2-4f43-a6e3-b8e844a65115)\"" pod="openstack/watcher-decision-engine-0" podUID="aad83b1a-8ea2-4f43-a6e3-b8e844a65115" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.026237 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-t7llp" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.026323 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t7llp" event={"ID":"c1b07e53-1ac0-4958-8ce3-b43ed35fbdef","Type":"ContainerDied","Data":"b0ff49c92bf15cf118ee74cdc20bf178aca60831f8bf4cebd5d158f249c773e3"} Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.026357 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0ff49c92bf15cf118ee74cdc20bf178aca60831f8bf4cebd5d158f249c773e3" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.028633 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dd98f6d57-fztws" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.029780 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.029821 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.030162 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.030183 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.090096 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.136678 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dd98f6d57-fztws"] Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.171519 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dd98f6d57-fztws"] Sep 30 
05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.230036 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6798cf9d78-m426q"] Sep 30 05:45:51 crc kubenswrapper[4956]: E0930 05:45:51.269679 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c94959-3c58-4a09-b7ee-bb13a3f82fb5" containerName="placement-db-sync" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.269731 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c94959-3c58-4a09-b7ee-bb13a3f82fb5" containerName="placement-db-sync" Sep 30 05:45:51 crc kubenswrapper[4956]: E0930 05:45:51.269788 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b07e53-1ac0-4958-8ce3-b43ed35fbdef" containerName="keystone-bootstrap" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.269795 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b07e53-1ac0-4958-8ce3-b43ed35fbdef" containerName="keystone-bootstrap" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.270393 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b07e53-1ac0-4958-8ce3-b43ed35fbdef" containerName="keystone-bootstrap" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.270415 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6c94959-3c58-4a09-b7ee-bb13a3f82fb5" containerName="placement-db-sync" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.271324 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6798cf9d78-m426q"] Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.271426 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.273801 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.274054 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.274651 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.275309 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.285399 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.296162 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n4lp7" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.373591 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-combined-ca-bundle\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.373660 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-scripts\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.373719 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-fernet-keys\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.373750 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-config-data\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.373820 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55qv7\" (UniqueName: \"kubernetes.io/projected/1997765b-9597-4f93-a11a-8df4f572dee4-kube-api-access-55qv7\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.373846 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-credential-keys\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.373869 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-internal-tls-certs\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.373939 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-public-tls-certs\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.476597 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-credential-keys\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.476651 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-internal-tls-certs\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.476700 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-public-tls-certs\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.476733 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-combined-ca-bundle\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.476754 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-scripts\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.476811 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-fernet-keys\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.476834 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-config-data\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.476883 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55qv7\" (UniqueName: \"kubernetes.io/projected/1997765b-9597-4f93-a11a-8df4f572dee4-kube-api-access-55qv7\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.490018 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-fernet-keys\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.490162 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-credential-keys\") pod 
\"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.491724 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-combined-ca-bundle\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.493361 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-public-tls-certs\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.495250 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-config-data\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.495655 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-internal-tls-certs\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.496486 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1997765b-9597-4f93-a11a-8df4f572dee4-scripts\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 
crc kubenswrapper[4956]: I0930 05:45:51.497554 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55qv7\" (UniqueName: \"kubernetes.io/projected/1997765b-9597-4f93-a11a-8df4f572dee4-kube-api-access-55qv7\") pod \"keystone-6798cf9d78-m426q\" (UID: \"1997765b-9597-4f93-a11a-8df4f572dee4\") " pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.644897 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.940283 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-654d9b45dd-f9lqj"] Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.942633 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.949970 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.950193 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.950375 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.950494 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.956415 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pp5qx" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.972411 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-654d9b45dd-f9lqj"] Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.994078 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-config-data\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.994149 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e5ec46-ef77-490f-b564-b3d4426dd9c8-logs\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.994176 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-internal-tls-certs\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.994266 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-combined-ca-bundle\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.994340 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-public-tls-certs\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.994405 
4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-scripts\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:51 crc kubenswrapper[4956]: I0930 05:45:51.994490 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2szrd\" (UniqueName: \"kubernetes.io/projected/17e5ec46-ef77-490f-b564-b3d4426dd9c8-kube-api-access-2szrd\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.048940 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f89721a-6d95-46fd-9a8f-d701ccda87b8","Type":"ContainerStarted","Data":"c8b29d95cf0db0cde67d5a4275b60e218d247a669e802878cb96c0a7ca4fafa3"} Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.051965 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cd8q4" event={"ID":"d947526f-907a-4951-bbc4-51e29c560a06","Type":"ContainerStarted","Data":"60c9efbf05be74f8df8e8f8e01f3abcc4fc69208616b8113d59a7436b44ac4da"} Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.053430 4956 scope.go:117] "RemoveContainer" containerID="2fa0d064ca663afac0ef8c15ff4e0f9d4ad6547c7ed2d0d784b1c82d476b062e" Sep 30 05:45:52 crc kubenswrapper[4956]: E0930 05:45:52.053680 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aad83b1a-8ea2-4f43-a6e3-b8e844a65115)\"" pod="openstack/watcher-decision-engine-0" podUID="aad83b1a-8ea2-4f43-a6e3-b8e844a65115" Sep 30 05:45:52 crc 
kubenswrapper[4956]: I0930 05:45:52.087854 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-cd8q4" podStartSLOduration=2.643510665 podStartE2EDuration="31.087824446s" podCreationTimestamp="2025-09-30 05:45:21 +0000 UTC" firstStartedPulling="2025-09-30 05:45:22.366818617 +0000 UTC m=+992.693939152" lastFinishedPulling="2025-09-30 05:45:50.811132408 +0000 UTC m=+1021.138252933" observedRunningTime="2025-09-30 05:45:52.07075027 +0000 UTC m=+1022.397870795" watchObservedRunningTime="2025-09-30 05:45:52.087824446 +0000 UTC m=+1022.414944971" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.096544 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-public-tls-certs\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.096625 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-scripts\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.096686 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2szrd\" (UniqueName: \"kubernetes.io/projected/17e5ec46-ef77-490f-b564-b3d4426dd9c8-kube-api-access-2szrd\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.096724 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-config-data\") 
pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.096759 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e5ec46-ef77-490f-b564-b3d4426dd9c8-logs\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.096782 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-internal-tls-certs\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.096811 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-combined-ca-bundle\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.111824 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e5ec46-ef77-490f-b564-b3d4426dd9c8-logs\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.127364 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2szrd\" (UniqueName: \"kubernetes.io/projected/17e5ec46-ef77-490f-b564-b3d4426dd9c8-kube-api-access-2szrd\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " 
pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.127911 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-scripts\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.128774 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-public-tls-certs\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.137611 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-internal-tls-certs\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.149455 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-config-data\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.151006 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e5ec46-ef77-490f-b564-b3d4426dd9c8-combined-ca-bundle\") pod \"placement-654d9b45dd-f9lqj\" (UID: \"17e5ec46-ef77-490f-b564-b3d4426dd9c8\") " pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.194057 4956 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/keystone-6798cf9d78-m426q"] Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.284884 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.370815 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca37042-172f-42db-ac83-85a5872720df" path="/var/lib/kubelet/pods/bca37042-172f-42db-ac83-85a5872720df/volumes" Sep 30 05:45:52 crc kubenswrapper[4956]: I0930 05:45:52.875350 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-654d9b45dd-f9lqj"] Sep 30 05:45:53 crc kubenswrapper[4956]: I0930 05:45:53.081584 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6798cf9d78-m426q" event={"ID":"1997765b-9597-4f93-a11a-8df4f572dee4","Type":"ContainerStarted","Data":"a281c36c2251b451eca6fefd2ecbc088a7974ba55e2de750b31fb6c161db4a9d"} Sep 30 05:45:53 crc kubenswrapper[4956]: I0930 05:45:53.082501 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6798cf9d78-m426q" event={"ID":"1997765b-9597-4f93-a11a-8df4f572dee4","Type":"ContainerStarted","Data":"b55cd5d3dd9d2035975f031a9265684d4b941423e14b4f99f0e3ab169445877d"} Sep 30 05:45:53 crc kubenswrapper[4956]: I0930 05:45:53.082920 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:45:53 crc kubenswrapper[4956]: I0930 05:45:53.117456 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-654d9b45dd-f9lqj" event={"ID":"17e5ec46-ef77-490f-b564-b3d4426dd9c8","Type":"ContainerStarted","Data":"184e4f9771b7d7437dac91c3baa1dcd3af7c42a2994ff6309d2ca06b502b6f80"} Sep 30 05:45:53 crc kubenswrapper[4956]: I0930 05:45:53.118743 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6798cf9d78-m426q" podStartSLOduration=2.118723287 
podStartE2EDuration="2.118723287s" podCreationTimestamp="2025-09-30 05:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:53.113910256 +0000 UTC m=+1023.441030791" watchObservedRunningTime="2025-09-30 05:45:53.118723287 +0000 UTC m=+1023.445843812" Sep 30 05:45:54 crc kubenswrapper[4956]: I0930 05:45:54.144548 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-654d9b45dd-f9lqj" event={"ID":"17e5ec46-ef77-490f-b564-b3d4426dd9c8","Type":"ContainerStarted","Data":"0d684dc392f7ac35abb008f57d9caf2c619e71a587f5cbc15813f70f07b97d03"} Sep 30 05:45:54 crc kubenswrapper[4956]: I0930 05:45:54.145072 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:54 crc kubenswrapper[4956]: I0930 05:45:54.145094 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:45:54 crc kubenswrapper[4956]: I0930 05:45:54.145105 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-654d9b45dd-f9lqj" event={"ID":"17e5ec46-ef77-490f-b564-b3d4426dd9c8","Type":"ContainerStarted","Data":"b1277c6365fc704a0f0fe6de0280d88abb97312c4076d7c67c7e6e0e6867e025"} Sep 30 05:45:54 crc kubenswrapper[4956]: I0930 05:45:54.168226 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-654d9b45dd-f9lqj" podStartSLOduration=3.168202062 podStartE2EDuration="3.168202062s" podCreationTimestamp="2025-09-30 05:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:45:54.166489858 +0000 UTC m=+1024.493610383" watchObservedRunningTime="2025-09-30 05:45:54.168202062 +0000 UTC m=+1024.495322587" Sep 30 05:45:54 crc kubenswrapper[4956]: I0930 05:45:54.232336 4956 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 05:45:54 crc kubenswrapper[4956]: I0930 05:45:54.232461 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 05:45:54 crc kubenswrapper[4956]: I0930 05:45:54.244444 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:54 crc kubenswrapper[4956]: I0930 05:45:54.244590 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 05:45:54 crc kubenswrapper[4956]: I0930 05:45:54.365091 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 05:45:54 crc kubenswrapper[4956]: I0930 05:45:54.573073 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 05:45:57 crc kubenswrapper[4956]: I0930 05:45:57.172993 4956 generic.go:334] "Generic (PLEG): container finished" podID="d947526f-907a-4951-bbc4-51e29c560a06" containerID="60c9efbf05be74f8df8e8f8e01f3abcc4fc69208616b8113d59a7436b44ac4da" exitCode=0 Sep 30 05:45:57 crc kubenswrapper[4956]: I0930 05:45:57.173042 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cd8q4" event={"ID":"d947526f-907a-4951-bbc4-51e29c560a06","Type":"ContainerDied","Data":"60c9efbf05be74f8df8e8f8e01f3abcc4fc69208616b8113d59a7436b44ac4da"} Sep 30 05:45:58 crc kubenswrapper[4956]: I0930 05:45:58.642018 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:45:58 crc kubenswrapper[4956]: I0930 05:45:58.798151 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:46:00 crc kubenswrapper[4956]: I0930 05:46:00.298550 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:46:00 crc kubenswrapper[4956]: I0930 05:46:00.605067 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5f96b888bb-bhtl9" Sep 30 05:46:00 crc kubenswrapper[4956]: I0930 05:46:00.722193 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68c96b554d-8mtnt"] Sep 30 05:46:01 crc kubenswrapper[4956]: I0930 05:46:01.214667 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-68c96b554d-8mtnt" podUID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" containerName="horizon-log" containerID="cri-o://39f6fba81bf2338227b34babb54a50886f23dca04dacce224c3d4af146eaea5f" gracePeriod=30 Sep 30 05:46:01 crc kubenswrapper[4956]: I0930 05:46:01.214707 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-68c96b554d-8mtnt" podUID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" containerName="horizon" containerID="cri-o://7eafd8dbb682893d7dec3183ccf8ede2bfcfc62744d82946285977f5c6f648d8" gracePeriod=30 Sep 30 05:46:02 crc kubenswrapper[4956]: I0930 05:46:02.228908 4956 generic.go:334] "Generic (PLEG): container finished" podID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" containerID="7eafd8dbb682893d7dec3183ccf8ede2bfcfc62744d82946285977f5c6f648d8" exitCode=0 Sep 30 05:46:02 crc kubenswrapper[4956]: I0930 05:46:02.228959 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68c96b554d-8mtnt" event={"ID":"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95","Type":"ContainerDied","Data":"7eafd8dbb682893d7dec3183ccf8ede2bfcfc62744d82946285977f5c6f648d8"} Sep 30 05:46:05 crc kubenswrapper[4956]: I0930 05:46:05.340849 4956 scope.go:117] "RemoveContainer" containerID="2fa0d064ca663afac0ef8c15ff4e0f9d4ad6547c7ed2d0d784b1c82d476b062e" Sep 30 05:46:06 crc kubenswrapper[4956]: I0930 05:46:06.268761 4956 generic.go:334] "Generic (PLEG): container finished" podID="d7d95941-d95f-4302-84f1-9230a7b9001f" 
containerID="de0db8d3f713cb626f0e9868ee39ba9e2ca3744a472c2f70318349ecca3f4c47" exitCode=0 Sep 30 05:46:06 crc kubenswrapper[4956]: I0930 05:46:06.268806 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hvzfv" event={"ID":"d7d95941-d95f-4302-84f1-9230a7b9001f","Type":"ContainerDied","Data":"de0db8d3f713cb626f0e9868ee39ba9e2ca3744a472c2f70318349ecca3f4c47"} Sep 30 05:46:06 crc kubenswrapper[4956]: I0930 05:46:06.785260 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-68c96b554d-8mtnt" podUID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Sep 30 05:46:07 crc kubenswrapper[4956]: I0930 05:46:07.524182 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cd8q4" Sep 30 05:46:07 crc kubenswrapper[4956]: I0930 05:46:07.631231 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d947526f-907a-4951-bbc4-51e29c560a06-combined-ca-bundle\") pod \"d947526f-907a-4951-bbc4-51e29c560a06\" (UID: \"d947526f-907a-4951-bbc4-51e29c560a06\") " Sep 30 05:46:07 crc kubenswrapper[4956]: I0930 05:46:07.631328 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p2hv\" (UniqueName: \"kubernetes.io/projected/d947526f-907a-4951-bbc4-51e29c560a06-kube-api-access-4p2hv\") pod \"d947526f-907a-4951-bbc4-51e29c560a06\" (UID: \"d947526f-907a-4951-bbc4-51e29c560a06\") " Sep 30 05:46:07 crc kubenswrapper[4956]: I0930 05:46:07.631481 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d947526f-907a-4951-bbc4-51e29c560a06-db-sync-config-data\") pod 
\"d947526f-907a-4951-bbc4-51e29c560a06\" (UID: \"d947526f-907a-4951-bbc4-51e29c560a06\") " Sep 30 05:46:07 crc kubenswrapper[4956]: I0930 05:46:07.637016 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d947526f-907a-4951-bbc4-51e29c560a06-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d947526f-907a-4951-bbc4-51e29c560a06" (UID: "d947526f-907a-4951-bbc4-51e29c560a06"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:07 crc kubenswrapper[4956]: I0930 05:46:07.645329 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d947526f-907a-4951-bbc4-51e29c560a06-kube-api-access-4p2hv" (OuterVolumeSpecName: "kube-api-access-4p2hv") pod "d947526f-907a-4951-bbc4-51e29c560a06" (UID: "d947526f-907a-4951-bbc4-51e29c560a06"). InnerVolumeSpecName "kube-api-access-4p2hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:07 crc kubenswrapper[4956]: I0930 05:46:07.666523 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d947526f-907a-4951-bbc4-51e29c560a06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d947526f-907a-4951-bbc4-51e29c560a06" (UID: "d947526f-907a-4951-bbc4-51e29c560a06"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:07 crc kubenswrapper[4956]: I0930 05:46:07.733789 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d947526f-907a-4951-bbc4-51e29c560a06-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:07 crc kubenswrapper[4956]: I0930 05:46:07.733824 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d947526f-907a-4951-bbc4-51e29c560a06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:07 crc kubenswrapper[4956]: I0930 05:46:07.733836 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p2hv\" (UniqueName: \"kubernetes.io/projected/d947526f-907a-4951-bbc4-51e29c560a06-kube-api-access-4p2hv\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:07 crc kubenswrapper[4956]: E0930 05:46:07.961599 4956 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48" Sep 30 05:46:07 crc kubenswrapper[4956]: E0930 05:46:07.961820 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bjjmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(5f89721a-6d95-46fd-9a8f-d701ccda87b8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 05:46:07 crc kubenswrapper[4956]: E0930 05:46:07.963345 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="5f89721a-6d95-46fd-9a8f-d701ccda87b8" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.294366 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f89721a-6d95-46fd-9a8f-d701ccda87b8" containerName="ceilometer-notification-agent" containerID="cri-o://e3cfd7068ca0b94e06643d04d3ee963e83a977de8c130820e50fda945ba0a2aa" gracePeriod=30 Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.294942 4956 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cd8q4" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.294937 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cd8q4" event={"ID":"d947526f-907a-4951-bbc4-51e29c560a06","Type":"ContainerDied","Data":"7a6b62a66a44cb3ad97d416b1cf493838df5729109a80aa88ac0d7304c7598b1"} Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.294987 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a6b62a66a44cb3ad97d416b1cf493838df5729109a80aa88ac0d7304c7598b1" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.295499 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f89721a-6d95-46fd-9a8f-d701ccda87b8" containerName="sg-core" containerID="cri-o://c8b29d95cf0db0cde67d5a4275b60e218d247a669e802878cb96c0a7ca4fafa3" gracePeriod=30 Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.813180 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-96cd79c6-2b9nr"] Sep 30 05:46:08 crc kubenswrapper[4956]: E0930 05:46:08.813589 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d947526f-907a-4951-bbc4-51e29c560a06" containerName="barbican-db-sync" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.813606 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d947526f-907a-4951-bbc4-51e29c560a06" containerName="barbican-db-sync" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.813817 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d947526f-907a-4951-bbc4-51e29c560a06" containerName="barbican-db-sync" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.814798 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.822011 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-56bb449d8f-bh9xp"] Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.823649 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.834137 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.842311 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.842465 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.842595 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vhp84" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.872174 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxk28\" (UniqueName: \"kubernetes.io/projected/6c9006da-cf02-4bed-8247-02b6e929ff98-kube-api-access-pxk28\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.872259 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0eca45-f776-4d77-8589-a2605f824696-combined-ca-bundle\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " 
pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.872287 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c9006da-cf02-4bed-8247-02b6e929ff98-logs\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.872306 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc0eca45-f776-4d77-8589-a2605f824696-logs\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.872327 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9006da-cf02-4bed-8247-02b6e929ff98-combined-ca-bundle\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.872347 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c9006da-cf02-4bed-8247-02b6e929ff98-config-data-custom\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.872409 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c9006da-cf02-4bed-8247-02b6e929ff98-config-data\") pod 
\"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.872449 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0eca45-f776-4d77-8589-a2605f824696-config-data\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.872493 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4s5h\" (UniqueName: \"kubernetes.io/projected/bc0eca45-f776-4d77-8589-a2605f824696-kube-api-access-j4s5h\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.872526 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc0eca45-f776-4d77-8589-a2605f824696-config-data-custom\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.873412 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-96cd79c6-2b9nr"] Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.894853 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56bb449d8f-bh9xp"] Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.973710 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/bc0eca45-f776-4d77-8589-a2605f824696-config-data-custom\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.974026 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxk28\" (UniqueName: \"kubernetes.io/projected/6c9006da-cf02-4bed-8247-02b6e929ff98-kube-api-access-pxk28\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.974073 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0eca45-f776-4d77-8589-a2605f824696-combined-ca-bundle\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.974096 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c9006da-cf02-4bed-8247-02b6e929ff98-logs\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.974131 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc0eca45-f776-4d77-8589-a2605f824696-logs\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.974150 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c9006da-cf02-4bed-8247-02b6e929ff98-combined-ca-bundle\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.974172 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c9006da-cf02-4bed-8247-02b6e929ff98-config-data-custom\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.974206 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c9006da-cf02-4bed-8247-02b6e929ff98-config-data\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.974241 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0eca45-f776-4d77-8589-a2605f824696-config-data\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.974280 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4s5h\" (UniqueName: \"kubernetes.io/projected/bc0eca45-f776-4d77-8589-a2605f824696-kube-api-access-j4s5h\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.974889 4956 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc0eca45-f776-4d77-8589-a2605f824696-logs\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.978520 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c9006da-cf02-4bed-8247-02b6e929ff98-logs\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.982051 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c9006da-cf02-4bed-8247-02b6e929ff98-config-data-custom\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.985723 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0eca45-f776-4d77-8589-a2605f824696-combined-ca-bundle\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.986527 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c9006da-cf02-4bed-8247-02b6e929ff98-config-data\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:08 crc kubenswrapper[4956]: I0930 05:46:08.988457 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6c9006da-cf02-4bed-8247-02b6e929ff98-combined-ca-bundle\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:08.989191 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0eca45-f776-4d77-8589-a2605f824696-config-data\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:08.991657 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc0eca45-f776-4d77-8589-a2605f824696-config-data-custom\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:08.999557 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-549c96b4c7-k89tn"] Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.009153 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4s5h\" (UniqueName: \"kubernetes.io/projected/bc0eca45-f776-4d77-8589-a2605f824696-kube-api-access-j4s5h\") pod \"barbican-keystone-listener-96cd79c6-2b9nr\" (UID: \"bc0eca45-f776-4d77-8589-a2605f824696\") " pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.010289 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.041109 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxk28\" (UniqueName: \"kubernetes.io/projected/6c9006da-cf02-4bed-8247-02b6e929ff98-kube-api-access-pxk28\") pod \"barbican-worker-56bb449d8f-bh9xp\" (UID: \"6c9006da-cf02-4bed-8247-02b6e929ff98\") " pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.043243 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-549c96b4c7-k89tn"] Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.087231 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cxtt\" (UniqueName: \"kubernetes.io/projected/a86bde89-5d3b-4fc0-932e-8e8638208a03-kube-api-access-5cxtt\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.087323 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-ovsdbserver-nb\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.087380 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-config\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.087411 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-dns-swift-storage-0\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.087442 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-dns-svc\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.087487 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-ovsdbserver-sb\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.150521 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.200672 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-56bb449d8f-bh9xp" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.201385 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-config\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.201447 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-dns-swift-storage-0\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.201486 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-dns-svc\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.201537 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-ovsdbserver-sb\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.201569 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cxtt\" (UniqueName: \"kubernetes.io/projected/a86bde89-5d3b-4fc0-932e-8e8638208a03-kube-api-access-5cxtt\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " 
pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.201600 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-ovsdbserver-nb\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.202497 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-ovsdbserver-nb\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.203023 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-ovsdbserver-sb\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.203192 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-config\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.204156 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-dns-svc\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.204195 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-dns-swift-storage-0\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.219587 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-574fdff478-trbvv"] Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.221336 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.224935 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.236714 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cxtt\" (UniqueName: \"kubernetes.io/projected/a86bde89-5d3b-4fc0-932e-8e8638208a03-kube-api-access-5cxtt\") pod \"dnsmasq-dns-549c96b4c7-k89tn\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.243641 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-574fdff478-trbvv"] Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.309494 4956 generic.go:334] "Generic (PLEG): container finished" podID="5f89721a-6d95-46fd-9a8f-d701ccda87b8" containerID="c8b29d95cf0db0cde67d5a4275b60e218d247a669e802878cb96c0a7ca4fafa3" exitCode=2 Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.309549 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f89721a-6d95-46fd-9a8f-d701ccda87b8","Type":"ContainerDied","Data":"c8b29d95cf0db0cde67d5a4275b60e218d247a669e802878cb96c0a7ca4fafa3"} Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 
05:46:09.406203 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-combined-ca-bundle\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.406521 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x2vv\" (UniqueName: \"kubernetes.io/projected/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-kube-api-access-6x2vv\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.406581 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-config-data\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.406624 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-logs\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.406670 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-config-data-custom\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" 
Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.452984 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.508517 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-config-data-custom\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.508709 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-combined-ca-bundle\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.508869 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x2vv\" (UniqueName: \"kubernetes.io/projected/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-kube-api-access-6x2vv\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.508905 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-config-data\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.508935 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-logs\") 
pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.509576 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-logs\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.512990 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-config-data-custom\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.514840 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-combined-ca-bundle\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.517269 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-config-data\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.538841 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x2vv\" (UniqueName: \"kubernetes.io/projected/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-kube-api-access-6x2vv\") pod \"barbican-api-574fdff478-trbvv\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " 
pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:09 crc kubenswrapper[4956]: I0930 05:46:09.584304 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.322081 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hvzfv" event={"ID":"d7d95941-d95f-4302-84f1-9230a7b9001f","Type":"ContainerDied","Data":"c897cbf71fa3097200e5f5e44fe8e696d2f565ca20b26a8dd66967e0517d986b"} Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.322552 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c897cbf71fa3097200e5f5e44fe8e696d2f565ca20b26a8dd66967e0517d986b" Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.568953 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.741943 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7d95941-d95f-4302-84f1-9230a7b9001f-config\") pod \"d7d95941-d95f-4302-84f1-9230a7b9001f\" (UID: \"d7d95941-d95f-4302-84f1-9230a7b9001f\") " Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.742351 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx8hf\" (UniqueName: \"kubernetes.io/projected/d7d95941-d95f-4302-84f1-9230a7b9001f-kube-api-access-fx8hf\") pod \"d7d95941-d95f-4302-84f1-9230a7b9001f\" (UID: \"d7d95941-d95f-4302-84f1-9230a7b9001f\") " Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.742509 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d95941-d95f-4302-84f1-9230a7b9001f-combined-ca-bundle\") pod \"d7d95941-d95f-4302-84f1-9230a7b9001f\" (UID: \"d7d95941-d95f-4302-84f1-9230a7b9001f\") " 
Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.750080 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d95941-d95f-4302-84f1-9230a7b9001f-kube-api-access-fx8hf" (OuterVolumeSpecName: "kube-api-access-fx8hf") pod "d7d95941-d95f-4302-84f1-9230a7b9001f" (UID: "d7d95941-d95f-4302-84f1-9230a7b9001f"). InnerVolumeSpecName "kube-api-access-fx8hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.785130 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d95941-d95f-4302-84f1-9230a7b9001f-config" (OuterVolumeSpecName: "config") pod "d7d95941-d95f-4302-84f1-9230a7b9001f" (UID: "d7d95941-d95f-4302-84f1-9230a7b9001f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.791293 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d95941-d95f-4302-84f1-9230a7b9001f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7d95941-d95f-4302-84f1-9230a7b9001f" (UID: "d7d95941-d95f-4302-84f1-9230a7b9001f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.844594 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d95941-d95f-4302-84f1-9230a7b9001f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.844646 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7d95941-d95f-4302-84f1-9230a7b9001f-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.844661 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx8hf\" (UniqueName: \"kubernetes.io/projected/d7d95941-d95f-4302-84f1-9230a7b9001f-kube-api-access-fx8hf\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.892995 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-574fdff478-trbvv"] Sep 30 05:46:10 crc kubenswrapper[4956]: W0930 05:46:10.896916 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bc192d7_07d9_4224_a9dc_b2a160c9c81a.slice/crio-fb166dd9918f8032ece76cdd54908865d9c2480069f50ec16f34c35202547edd WatchSource:0}: Error finding container fb166dd9918f8032ece76cdd54908865d9c2480069f50ec16f34c35202547edd: Status 404 returned error can't find the container with id fb166dd9918f8032ece76cdd54908865d9c2480069f50ec16f34c35202547edd Sep 30 05:46:10 crc kubenswrapper[4956]: I0930 05:46:10.993966 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-549c96b4c7-k89tn"] Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.008790 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56bb449d8f-bh9xp"] Sep 30 05:46:11 crc kubenswrapper[4956]: E0930 05:46:11.166019 4956 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current" Sep 30 05:46:11 crc kubenswrapper[4956]: E0930 05:46:11.166066 4956 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current" Sep 30 05:46:11 crc kubenswrapper[4956]: E0930 05:46:11.166198 4956 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f8xzz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-nnr6k_openstack(740de0e5-c3e9-43bc-bb01-8c240f50070e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 05:46:11 crc kubenswrapper[4956]: E0930 05:46:11.167381 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-nnr6k" podUID="740de0e5-c3e9-43bc-bb01-8c240f50070e" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.251898 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-96cd79c6-2b9nr"] Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.344738 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" event={"ID":"bc0eca45-f776-4d77-8589-a2605f824696","Type":"ContainerStarted","Data":"090891d42dc8373a74b2eaebbff366ad5c8e6a5adb80107d776ba80c61c6540b"} Sep 
30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.362182 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" event={"ID":"a86bde89-5d3b-4fc0-932e-8e8638208a03","Type":"ContainerStarted","Data":"408dbb4c646b61104d48cd33c18d4c3d0c393d609d2a7b25c01ef16a65ce3ddf"} Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.368748 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aad83b1a-8ea2-4f43-a6e3-b8e844a65115","Type":"ContainerStarted","Data":"4ef654140d80b3a5537f7ceebebd417fcff6cad58728141934dbf21caaecab9f"} Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.375033 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56bb449d8f-bh9xp" event={"ID":"6c9006da-cf02-4bed-8247-02b6e929ff98","Type":"ContainerStarted","Data":"11bba153ba279affe950998230e2f289552726a38dafa4343ccc578a3d90efb7"} Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.384184 4956 generic.go:334] "Generic (PLEG): container finished" podID="b25f900c-e486-4143-8ef9-0416286dc2dc" containerID="b305e374a7105939e3230a705ca3319a1e63c3c43ed3f80a90c520632d8e2ec8" exitCode=137 Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.384216 4956 generic.go:334] "Generic (PLEG): container finished" podID="b25f900c-e486-4143-8ef9-0416286dc2dc" containerID="64f339e43dae649aa08f6d71106076c98f8eb556995a4093bf8a403dd78559d3" exitCode=137 Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.384258 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dfd95655c-tvpq8" event={"ID":"b25f900c-e486-4143-8ef9-0416286dc2dc","Type":"ContainerDied","Data":"b305e374a7105939e3230a705ca3319a1e63c3c43ed3f80a90c520632d8e2ec8"} Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.384281 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dfd95655c-tvpq8" 
event={"ID":"b25f900c-e486-4143-8ef9-0416286dc2dc","Type":"ContainerDied","Data":"64f339e43dae649aa08f6d71106076c98f8eb556995a4093bf8a403dd78559d3"} Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.412747 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574fdff478-trbvv" event={"ID":"5bc192d7-07d9-4224-a9dc-b2a160c9c81a","Type":"ContainerStarted","Data":"fb166dd9918f8032ece76cdd54908865d9c2480069f50ec16f34c35202547edd"} Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.412830 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hvzfv" Sep 30 05:46:11 crc kubenswrapper[4956]: E0930 05:46:11.419448 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current\\\"\"" pod="openstack/cinder-db-sync-nnr6k" podUID="740de0e5-c3e9-43bc-bb01-8c240f50070e" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.721029 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-dfd95655c-tvpq8" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.808224 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-549c96b4c7-k89tn"] Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.865205 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84c68846bf-fppc2"] Sep 30 05:46:11 crc kubenswrapper[4956]: E0930 05:46:11.865734 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25f900c-e486-4143-8ef9-0416286dc2dc" containerName="horizon-log" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.865747 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25f900c-e486-4143-8ef9-0416286dc2dc" containerName="horizon-log" Sep 30 05:46:11 crc kubenswrapper[4956]: E0930 05:46:11.865768 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d95941-d95f-4302-84f1-9230a7b9001f" containerName="neutron-db-sync" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.865773 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d95941-d95f-4302-84f1-9230a7b9001f" containerName="neutron-db-sync" Sep 30 05:46:11 crc kubenswrapper[4956]: E0930 05:46:11.865785 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25f900c-e486-4143-8ef9-0416286dc2dc" containerName="horizon" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.865791 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25f900c-e486-4143-8ef9-0416286dc2dc" containerName="horizon" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.865997 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b25f900c-e486-4143-8ef9-0416286dc2dc" containerName="horizon" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.866022 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d95941-d95f-4302-84f1-9230a7b9001f" containerName="neutron-db-sync" Sep 30 05:46:11 crc kubenswrapper[4956]: 
I0930 05:46:11.866035 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b25f900c-e486-4143-8ef9-0416286dc2dc" containerName="horizon-log" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.866028 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b25f900c-e486-4143-8ef9-0416286dc2dc-config-data\") pod \"b25f900c-e486-4143-8ef9-0416286dc2dc\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.867725 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b25f900c-e486-4143-8ef9-0416286dc2dc-logs\") pod \"b25f900c-e486-4143-8ef9-0416286dc2dc\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.867804 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b25f900c-e486-4143-8ef9-0416286dc2dc-scripts\") pod \"b25f900c-e486-4143-8ef9-0416286dc2dc\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.867892 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd9fj\" (UniqueName: \"kubernetes.io/projected/b25f900c-e486-4143-8ef9-0416286dc2dc-kube-api-access-hd9fj\") pod \"b25f900c-e486-4143-8ef9-0416286dc2dc\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.867918 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b25f900c-e486-4143-8ef9-0416286dc2dc-horizon-secret-key\") pod \"b25f900c-e486-4143-8ef9-0416286dc2dc\" (UID: \"b25f900c-e486-4143-8ef9-0416286dc2dc\") " Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.869786 4956 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c68846bf-fppc2" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.870037 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b25f900c-e486-4143-8ef9-0416286dc2dc-logs" (OuterVolumeSpecName: "logs") pod "b25f900c-e486-4143-8ef9-0416286dc2dc" (UID: "b25f900c-e486-4143-8ef9-0416286dc2dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.877568 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b25f900c-e486-4143-8ef9-0416286dc2dc-kube-api-access-hd9fj" (OuterVolumeSpecName: "kube-api-access-hd9fj") pod "b25f900c-e486-4143-8ef9-0416286dc2dc" (UID: "b25f900c-e486-4143-8ef9-0416286dc2dc"). InnerVolumeSpecName "kube-api-access-hd9fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.878674 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b25f900c-e486-4143-8ef9-0416286dc2dc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b25f900c-e486-4143-8ef9-0416286dc2dc" (UID: "b25f900c-e486-4143-8ef9-0416286dc2dc"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.888943 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c68846bf-fppc2"] Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.949397 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b25f900c-e486-4143-8ef9-0416286dc2dc-scripts" (OuterVolumeSpecName: "scripts") pod "b25f900c-e486-4143-8ef9-0416286dc2dc" (UID: "b25f900c-e486-4143-8ef9-0416286dc2dc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.952339 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b25f900c-e486-4143-8ef9-0416286dc2dc-config-data" (OuterVolumeSpecName: "config-data") pod "b25f900c-e486-4143-8ef9-0416286dc2dc" (UID: "b25f900c-e486-4143-8ef9-0416286dc2dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.974800 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-ovsdbserver-sb\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.975570 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ll9f\" (UniqueName: \"kubernetes.io/projected/87cb9729-9cb1-4377-834e-b6e5c11e39a3-kube-api-access-5ll9f\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.975671 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-dns-svc\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.975797 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-dns-swift-storage-0\") 
pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.975866 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-config\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.975941 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-ovsdbserver-nb\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.976155 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b25f900c-e486-4143-8ef9-0416286dc2dc-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.976215 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b25f900c-e486-4143-8ef9-0416286dc2dc-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.976266 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd9fj\" (UniqueName: \"kubernetes.io/projected/b25f900c-e486-4143-8ef9-0416286dc2dc-kube-api-access-hd9fj\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.976318 4956 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b25f900c-e486-4143-8ef9-0416286dc2dc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 
05:46:11 crc kubenswrapper[4956]: I0930 05:46:11.976384 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b25f900c-e486-4143-8ef9-0416286dc2dc-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.067339 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-646f99fb9d-9lthp"]
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.069151 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.075251 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.079687 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.083229 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-ovsdbserver-sb\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.083293 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ll9f\" (UniqueName: \"kubernetes.io/projected/87cb9729-9cb1-4377-834e-b6e5c11e39a3-kube-api-access-5ll9f\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.083365 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-dns-svc\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.083475 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-dns-swift-storage-0\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.083496 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-config\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.083515 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-ovsdbserver-nb\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.084679 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-ovsdbserver-nb\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.086527 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-config\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.086591 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-ovsdbserver-sb\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.086669 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-dns-svc\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.088565 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-dns-swift-storage-0\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.098576 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fc5c89796-dkbm8"]
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.114361 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.118808 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2xwn5"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.119558 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.119920 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.120236 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.131246 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ll9f\" (UniqueName: \"kubernetes.io/projected/87cb9729-9cb1-4377-834e-b6e5c11e39a3-kube-api-access-5ll9f\") pod \"dnsmasq-dns-84c68846bf-fppc2\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.153638 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-646f99fb9d-9lthp"]
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.205162 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzv88\" (UniqueName: \"kubernetes.io/projected/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-kube-api-access-rzv88\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.205223 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-logs\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.205461 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-config-data-custom\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.205484 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-public-tls-certs\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.205546 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-internal-tls-certs\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.210030 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fc5c89796-dkbm8"]
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.211742 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-combined-ca-bundle\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.213460 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-config-data\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.242626 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c68846bf-fppc2"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.315316 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-httpd-config\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.315392 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-ovndb-tls-certs\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.315516 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-config-data\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.315547 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-combined-ca-bundle\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.315621 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzv88\" (UniqueName: \"kubernetes.io/projected/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-kube-api-access-rzv88\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.315662 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlwqp\" (UniqueName: \"kubernetes.io/projected/ecc33e8d-e6f1-40aa-893c-b84853695537-kube-api-access-nlwqp\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.315705 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-logs\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.315740 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-config-data-custom\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.315767 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-public-tls-certs\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.315845 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-internal-tls-certs\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.315921 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-config\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.315949 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-combined-ca-bundle\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.319966 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-logs\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.336808 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-config-data-custom\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.337486 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-config-data\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.341472 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-combined-ca-bundle\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.343603 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-public-tls-certs\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.353339 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzv88\" (UniqueName: \"kubernetes.io/projected/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-kube-api-access-rzv88\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.354866 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e3d2ce-91f3-420a-b8a2-ebeb4c113565-internal-tls-certs\") pod \"barbican-api-646f99fb9d-9lthp\" (UID: \"45e3d2ce-91f3-420a-b8a2-ebeb4c113565\") " pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.404039 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-646f99fb9d-9lthp"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.419826 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-config\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.419905 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-httpd-config\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.419924 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-ovndb-tls-certs\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.419968 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-combined-ca-bundle\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.420012 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlwqp\" (UniqueName: \"kubernetes.io/projected/ecc33e8d-e6f1-40aa-893c-b84853695537-kube-api-access-nlwqp\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.439320 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-httpd-config\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.439680 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-combined-ca-bundle\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.443882 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-ovndb-tls-certs\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.458910 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlwqp\" (UniqueName: \"kubernetes.io/projected/ecc33e8d-e6f1-40aa-893c-b84853695537-kube-api-access-nlwqp\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.470927 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-config\") pod \"neutron-fc5c89796-dkbm8\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.521539 4956 generic.go:334] "Generic (PLEG): container finished" podID="a86bde89-5d3b-4fc0-932e-8e8638208a03" containerID="7e3cd3786c5dfb579e97503f6d0a3c0c4a81a3e6dd08f6933a1ce7dd36ab7c99" exitCode=0
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.521614 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" event={"ID":"a86bde89-5d3b-4fc0-932e-8e8638208a03","Type":"ContainerDied","Data":"7e3cd3786c5dfb579e97503f6d0a3c0c4a81a3e6dd08f6933a1ce7dd36ab7c99"}
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.549255 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574fdff478-trbvv" event={"ID":"5bc192d7-07d9-4224-a9dc-b2a160c9c81a","Type":"ContainerStarted","Data":"00d21ce2570d50ab021776408cbad2f074a32d9fc947a029a70dea9260a6c1fb"}
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.549578 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574fdff478-trbvv" event={"ID":"5bc192d7-07d9-4224-a9dc-b2a160c9c81a","Type":"ContainerStarted","Data":"30b57311abaacd79a790ede4294ca375e4e289e726c3c3ba8a865bde408d2eca"}
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.549617 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-574fdff478-trbvv"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.549634 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-574fdff478-trbvv"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.557396 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dfd95655c-tvpq8" event={"ID":"b25f900c-e486-4143-8ef9-0416286dc2dc","Type":"ContainerDied","Data":"549b9489b9bbd1a79bdd9e8278a48c3d1701339a39bc09cc9e90153c53228ea5"}
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.557471 4956 scope.go:117] "RemoveContainer" containerID="b305e374a7105939e3230a705ca3319a1e63c3c43ed3f80a90c520632d8e2ec8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.557676 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dfd95655c-tvpq8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.603717 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-574fdff478-trbvv" podStartSLOduration=3.603687311 podStartE2EDuration="3.603687311s" podCreationTimestamp="2025-09-30 05:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:46:12.583204258 +0000 UTC m=+1042.910324813" watchObservedRunningTime="2025-09-30 05:46:12.603687311 +0000 UTC m=+1042.930807836"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.652618 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dfd95655c-tvpq8"]
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.664740 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-dfd95655c-tvpq8"]
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.759383 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc5c89796-dkbm8"
Sep 30 05:46:12 crc kubenswrapper[4956]: I0930 05:46:12.911726 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c68846bf-fppc2"]
Sep 30 05:46:13 crc kubenswrapper[4956]: W0930 05:46:13.163074 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87cb9729_9cb1_4377_834e_b6e5c11e39a3.slice/crio-5d0712f4e33630600f07994ecf835c08f67614486b6042d5077d6b31f0ad9039 WatchSource:0}: Error finding container 5d0712f4e33630600f07994ecf835c08f67614486b6042d5077d6b31f0ad9039: Status 404 returned error can't find the container with id 5d0712f4e33630600f07994ecf835c08f67614486b6042d5077d6b31f0ad9039
Sep 30 05:46:13 crc kubenswrapper[4956]: I0930 05:46:13.571932 4956 generic.go:334] "Generic (PLEG): container finished" podID="5f89721a-6d95-46fd-9a8f-d701ccda87b8" containerID="e3cfd7068ca0b94e06643d04d3ee963e83a977de8c130820e50fda945ba0a2aa" exitCode=0
Sep 30 05:46:13 crc kubenswrapper[4956]: I0930 05:46:13.571991 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f89721a-6d95-46fd-9a8f-d701ccda87b8","Type":"ContainerDied","Data":"e3cfd7068ca0b94e06643d04d3ee963e83a977de8c130820e50fda945ba0a2aa"}
Sep 30 05:46:13 crc kubenswrapper[4956]: I0930 05:46:13.573812 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c68846bf-fppc2" event={"ID":"87cb9729-9cb1-4377-834e-b6e5c11e39a3","Type":"ContainerStarted","Data":"5d0712f4e33630600f07994ecf835c08f67614486b6042d5077d6b31f0ad9039"}
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.145960 4956 scope.go:117] "RemoveContainer" containerID="64f339e43dae649aa08f6d71106076c98f8eb556995a4093bf8a403dd78559d3"
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.353054 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b25f900c-e486-4143-8ef9-0416286dc2dc" path="/var/lib/kubelet/pods/b25f900c-e486-4143-8ef9-0416286dc2dc/volumes"
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.415741 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-646f99fb9d-9lthp"]
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.438249 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.568493 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-scripts\") pod \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") "
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.568610 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-sg-core-conf-yaml\") pod \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") "
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.568686 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-config-data\") pod \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") "
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.568735 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f89721a-6d95-46fd-9a8f-d701ccda87b8-log-httpd\") pod \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") "
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.568809 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-combined-ca-bundle\") pod \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") "
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.568858 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjjmh\" (UniqueName: \"kubernetes.io/projected/5f89721a-6d95-46fd-9a8f-d701ccda87b8-kube-api-access-bjjmh\") pod \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") "
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.568975 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f89721a-6d95-46fd-9a8f-d701ccda87b8-run-httpd\") pod \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\" (UID: \"5f89721a-6d95-46fd-9a8f-d701ccda87b8\") "
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.569965 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f89721a-6d95-46fd-9a8f-d701ccda87b8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5f89721a-6d95-46fd-9a8f-d701ccda87b8" (UID: "5f89721a-6d95-46fd-9a8f-d701ccda87b8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.574565 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f89721a-6d95-46fd-9a8f-d701ccda87b8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5f89721a-6d95-46fd-9a8f-d701ccda87b8" (UID: "5f89721a-6d95-46fd-9a8f-d701ccda87b8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.583881 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-646f99fb9d-9lthp" event={"ID":"45e3d2ce-91f3-420a-b8a2-ebeb4c113565","Type":"ContainerStarted","Data":"af6fae9fa88c7e1d2181078b41fd9c2c2bae271589bb447a65dfabb0a92cee63"}
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.592215 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f89721a-6d95-46fd-9a8f-d701ccda87b8","Type":"ContainerDied","Data":"b14b7ec64210ac7f9e7bca12e2321c4072a34a021029e24f5487421e26d9caad"}
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.592277 4956 scope.go:117] "RemoveContainer" containerID="c8b29d95cf0db0cde67d5a4275b60e218d247a669e802878cb96c0a7ca4fafa3"
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.587208 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-scripts" (OuterVolumeSpecName: "scripts") pod "5f89721a-6d95-46fd-9a8f-d701ccda87b8" (UID: "5f89721a-6d95-46fd-9a8f-d701ccda87b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.590091 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.587890 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f89721a-6d95-46fd-9a8f-d701ccda87b8-kube-api-access-bjjmh" (OuterVolumeSpecName: "kube-api-access-bjjmh") pod "5f89721a-6d95-46fd-9a8f-d701ccda87b8" (UID: "5f89721a-6d95-46fd-9a8f-d701ccda87b8"). InnerVolumeSpecName "kube-api-access-bjjmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.623281 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f89721a-6d95-46fd-9a8f-d701ccda87b8" (UID: "5f89721a-6d95-46fd-9a8f-d701ccda87b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.639240 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-config-data" (OuterVolumeSpecName: "config-data") pod "5f89721a-6d95-46fd-9a8f-d701ccda87b8" (UID: "5f89721a-6d95-46fd-9a8f-d701ccda87b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.671697 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.671730 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjjmh\" (UniqueName: \"kubernetes.io/projected/5f89721a-6d95-46fd-9a8f-d701ccda87b8-kube-api-access-bjjmh\") on node \"crc\" DevicePath \"\""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.671742 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f89721a-6d95-46fd-9a8f-d701ccda87b8-run-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.671760 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.671773 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.671782 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f89721a-6d95-46fd-9a8f-d701ccda87b8-log-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.678428 4956 scope.go:117] "RemoveContainer" containerID="e3cfd7068ca0b94e06643d04d3ee963e83a977de8c130820e50fda945ba0a2aa"
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.700070 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5f89721a-6d95-46fd-9a8f-d701ccda87b8" (UID: "5f89721a-6d95-46fd-9a8f-d701ccda87b8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.775266 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f89721a-6d95-46fd-9a8f-d701ccda87b8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.790389 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fc5c89796-dkbm8"]
Sep 30 05:46:14 crc kubenswrapper[4956]: W0930 05:46:14.814366 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecc33e8d_e6f1_40aa_893c_b84853695537.slice/crio-e0699261d38288b4952fcf1e23908c74257c44aab61b25cd673db74cd624f039 WatchSource:0}: Error finding container e0699261d38288b4952fcf1e23908c74257c44aab61b25cd673db74cd624f039: Status 404 returned error can't find the container with id e0699261d38288b4952fcf1e23908c74257c44aab61b25cd673db74cd624f039
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.965730 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 05:46:14 crc kubenswrapper[4956]: I0930 05:46:14.994920 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.010395 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Sep 30 05:46:15 crc kubenswrapper[4956]: E0930 05:46:15.010831 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f89721a-6d95-46fd-9a8f-d701ccda87b8" containerName="sg-core"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.010847 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f89721a-6d95-46fd-9a8f-d701ccda87b8" containerName="sg-core"
Sep 30 05:46:15 crc kubenswrapper[4956]: E0930 05:46:15.010860 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f89721a-6d95-46fd-9a8f-d701ccda87b8" containerName="ceilometer-notification-agent"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.010866 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f89721a-6d95-46fd-9a8f-d701ccda87b8" containerName="ceilometer-notification-agent"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.011054 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f89721a-6d95-46fd-9a8f-d701ccda87b8" containerName="ceilometer-notification-agent"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.011087 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f89721a-6d95-46fd-9a8f-d701ccda87b8" containerName="sg-core"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.016052 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.020403 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.022222 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.049450 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.186293 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4dzk\" (UniqueName: \"kubernetes.io/projected/e9465cbc-c93e-4082-8ec8-fdd77d710210-kube-api-access-z4dzk\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.186384 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-config-data\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.186437 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-scripts\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.186496 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.186565 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9465cbc-c93e-4082-8ec8-fdd77d710210-run-httpd\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.186583 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0"
Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.186711 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9465cbc-c93e-4082-8ec8-fdd77d710210-log-httpd\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") "
pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.289136 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4dzk\" (UniqueName: \"kubernetes.io/projected/e9465cbc-c93e-4082-8ec8-fdd77d710210-kube-api-access-z4dzk\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.289209 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-config-data\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.289262 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-scripts\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.289304 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.289369 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9465cbc-c93e-4082-8ec8-fdd77d710210-run-httpd\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.289384 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.289467 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9465cbc-c93e-4082-8ec8-fdd77d710210-log-httpd\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.289904 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9465cbc-c93e-4082-8ec8-fdd77d710210-run-httpd\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.290061 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9465cbc-c93e-4082-8ec8-fdd77d710210-log-httpd\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.294877 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-config-data\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.297107 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.300341 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.304335 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-scripts\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.306826 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4dzk\" (UniqueName: \"kubernetes.io/projected/e9465cbc-c93e-4082-8ec8-fdd77d710210-kube-api-access-z4dzk\") pod \"ceilometer-0\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.337209 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.607221 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc5c89796-dkbm8" event={"ID":"ecc33e8d-e6f1-40aa-893c-b84853695537","Type":"ContainerStarted","Data":"e0699261d38288b4952fcf1e23908c74257c44aab61b25cd673db74cd624f039"} Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.697224 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cbdcdc45c-b9667"] Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.699032 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.702732 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.703156 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.724567 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cbdcdc45c-b9667"] Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.805004 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.823318 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-httpd-config\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.823383 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mdbl\" (UniqueName: \"kubernetes.io/projected/0a1e613a-d4fe-4779-97f9-a931d68f083c-kube-api-access-4mdbl\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.823414 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-internal-tls-certs\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.823498 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-public-tls-certs\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.823567 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-config\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.823595 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-combined-ca-bundle\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.823628 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-ovndb-tls-certs\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.925283 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-httpd-config\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.925338 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4mdbl\" (UniqueName: \"kubernetes.io/projected/0a1e613a-d4fe-4779-97f9-a931d68f083c-kube-api-access-4mdbl\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.925363 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-internal-tls-certs\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.925434 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-public-tls-certs\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.925471 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-config\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.925487 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-combined-ca-bundle\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.925511 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-ovndb-tls-certs\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.931364 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-httpd-config\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.931439 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-config\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.931736 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-ovndb-tls-certs\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.932747 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-combined-ca-bundle\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.934532 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-public-tls-certs\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " 
pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.934662 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a1e613a-d4fe-4779-97f9-a931d68f083c-internal-tls-certs\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:15 crc kubenswrapper[4956]: I0930 05:46:15.944838 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mdbl\" (UniqueName: \"kubernetes.io/projected/0a1e613a-d4fe-4779-97f9-a931d68f083c-kube-api-access-4mdbl\") pod \"neutron-cbdcdc45c-b9667\" (UID: \"0a1e613a-d4fe-4779-97f9-a931d68f083c\") " pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:16 crc kubenswrapper[4956]: I0930 05:46:16.036222 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:16 crc kubenswrapper[4956]: E0930 05:46:16.226919 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaad83b1a_8ea2_4f43_a6e3_b8e844a65115.slice/crio-conmon-4ef654140d80b3a5537f7ceebebd417fcff6cad58728141934dbf21caaecab9f.scope\": RecentStats: unable to find data in memory cache]" Sep 30 05:46:16 crc kubenswrapper[4956]: I0930 05:46:16.355488 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f89721a-6d95-46fd-9a8f-d701ccda87b8" path="/var/lib/kubelet/pods/5f89721a-6d95-46fd-9a8f-d701ccda87b8/volumes" Sep 30 05:46:16 crc kubenswrapper[4956]: I0930 05:46:16.621140 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9465cbc-c93e-4082-8ec8-fdd77d710210","Type":"ContainerStarted","Data":"7c86f0ef629175f0a102d4f3095b041c0faf6127f45335d08a416a60cc9647ab"} Sep 30 05:46:16 crc kubenswrapper[4956]: I0930 
05:46:16.623197 4956 generic.go:334] "Generic (PLEG): container finished" podID="aad83b1a-8ea2-4f43-a6e3-b8e844a65115" containerID="4ef654140d80b3a5537f7ceebebd417fcff6cad58728141934dbf21caaecab9f" exitCode=1 Sep 30 05:46:16 crc kubenswrapper[4956]: I0930 05:46:16.623290 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aad83b1a-8ea2-4f43-a6e3-b8e844a65115","Type":"ContainerDied","Data":"4ef654140d80b3a5537f7ceebebd417fcff6cad58728141934dbf21caaecab9f"} Sep 30 05:46:16 crc kubenswrapper[4956]: I0930 05:46:16.623361 4956 scope.go:117] "RemoveContainer" containerID="2fa0d064ca663afac0ef8c15ff4e0f9d4ad6547c7ed2d0d784b1c82d476b062e" Sep 30 05:46:16 crc kubenswrapper[4956]: I0930 05:46:16.624438 4956 scope.go:117] "RemoveContainer" containerID="4ef654140d80b3a5537f7ceebebd417fcff6cad58728141934dbf21caaecab9f" Sep 30 05:46:16 crc kubenswrapper[4956]: E0930 05:46:16.624720 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aad83b1a-8ea2-4f43-a6e3-b8e844a65115)\"" pod="openstack/watcher-decision-engine-0" podUID="aad83b1a-8ea2-4f43-a6e3-b8e844a65115" Sep 30 05:46:16 crc kubenswrapper[4956]: I0930 05:46:16.624760 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc5c89796-dkbm8" event={"ID":"ecc33e8d-e6f1-40aa-893c-b84853695537","Type":"ContainerStarted","Data":"66dcc3d9abfa9b844f72e1c370dd183b80de9d10201c7a69b2ae21b199ef8cf1"} Sep 30 05:46:16 crc kubenswrapper[4956]: I0930 05:46:16.634756 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cbdcdc45c-b9667"] Sep 30 05:46:16 crc kubenswrapper[4956]: I0930 05:46:16.784687 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-68c96b554d-8mtnt" podUID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" 
containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.637813 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc5c89796-dkbm8" event={"ID":"ecc33e8d-e6f1-40aa-893c-b84853695537","Type":"ContainerStarted","Data":"8d5cff6e3b9c3fc6b37aa69fc684da082e2ba0692515596e61bff0abbc889b8f"} Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.638213 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-fc5c89796-dkbm8" Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.640280 4956 generic.go:334] "Generic (PLEG): container finished" podID="87cb9729-9cb1-4377-834e-b6e5c11e39a3" containerID="627c5233fd1951147a9f45a24876d048b39fffc8d598ba290be1bfec539e15b1" exitCode=0 Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.640378 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c68846bf-fppc2" event={"ID":"87cb9729-9cb1-4377-834e-b6e5c11e39a3","Type":"ContainerDied","Data":"627c5233fd1951147a9f45a24876d048b39fffc8d598ba290be1bfec539e15b1"} Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.642284 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56bb449d8f-bh9xp" event={"ID":"6c9006da-cf02-4bed-8247-02b6e929ff98","Type":"ContainerStarted","Data":"7ff5c1da65828e45584d99b989d972a32c28b1fa86229960aa081f2cf4107ad1"} Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.651770 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-646f99fb9d-9lthp" event={"ID":"45e3d2ce-91f3-420a-b8a2-ebeb4c113565","Type":"ContainerStarted","Data":"7de7046edafa1fc54f9a004cae945abb0986a60b9ec4aaf2cef3f3a48aeb6adb"} Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.661066 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" event={"ID":"a86bde89-5d3b-4fc0-932e-8e8638208a03","Type":"ContainerStarted","Data":"8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd"} Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.661354 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" podUID="a86bde89-5d3b-4fc0-932e-8e8638208a03" containerName="dnsmasq-dns" containerID="cri-o://8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd" gracePeriod=10 Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.661630 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.661615 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-fc5c89796-dkbm8" podStartSLOduration=5.661591819 podStartE2EDuration="5.661591819s" podCreationTimestamp="2025-09-30 05:46:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:46:17.656979204 +0000 UTC m=+1047.984099729" watchObservedRunningTime="2025-09-30 05:46:17.661591819 +0000 UTC m=+1047.988712364" Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.666022 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cbdcdc45c-b9667" event={"ID":"0a1e613a-d4fe-4779-97f9-a931d68f083c","Type":"ContainerStarted","Data":"142baab5922ffb148ae322263c5d067f7f0653e621732c74e3799f7b199fc05d"} Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.666089 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cbdcdc45c-b9667" event={"ID":"0a1e613a-d4fe-4779-97f9-a931d68f083c","Type":"ContainerStarted","Data":"e82f4f47b83c01a53295eb747d0deee025c33e8f6360f4ee47dc83154c562d3e"} Sep 30 05:46:17 crc kubenswrapper[4956]: I0930 05:46:17.718780 4956 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" podStartSLOduration=9.718760086 podStartE2EDuration="9.718760086s" podCreationTimestamp="2025-09-30 05:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:46:17.704842379 +0000 UTC m=+1048.031962914" watchObservedRunningTime="2025-09-30 05:46:17.718760086 +0000 UTC m=+1048.045880611" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.076258 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.076693 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.246860 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.388711 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-dns-svc\") pod \"a86bde89-5d3b-4fc0-932e-8e8638208a03\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.388811 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-config\") pod \"a86bde89-5d3b-4fc0-932e-8e8638208a03\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.388874 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cxtt\" (UniqueName: \"kubernetes.io/projected/a86bde89-5d3b-4fc0-932e-8e8638208a03-kube-api-access-5cxtt\") pod \"a86bde89-5d3b-4fc0-932e-8e8638208a03\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.388933 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-dns-swift-storage-0\") pod \"a86bde89-5d3b-4fc0-932e-8e8638208a03\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.388963 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-ovsdbserver-nb\") pod \"a86bde89-5d3b-4fc0-932e-8e8638208a03\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.388986 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-ovsdbserver-sb\") pod \"a86bde89-5d3b-4fc0-932e-8e8638208a03\" (UID: \"a86bde89-5d3b-4fc0-932e-8e8638208a03\") " Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.407860 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a86bde89-5d3b-4fc0-932e-8e8638208a03-kube-api-access-5cxtt" (OuterVolumeSpecName: "kube-api-access-5cxtt") pod "a86bde89-5d3b-4fc0-932e-8e8638208a03" (UID: "a86bde89-5d3b-4fc0-932e-8e8638208a03"). InnerVolumeSpecName "kube-api-access-5cxtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.472899 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a86bde89-5d3b-4fc0-932e-8e8638208a03" (UID: "a86bde89-5d3b-4fc0-932e-8e8638208a03"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.475798 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a86bde89-5d3b-4fc0-932e-8e8638208a03" (UID: "a86bde89-5d3b-4fc0-932e-8e8638208a03"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.480503 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a86bde89-5d3b-4fc0-932e-8e8638208a03" (UID: "a86bde89-5d3b-4fc0-932e-8e8638208a03"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.493804 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cxtt\" (UniqueName: \"kubernetes.io/projected/a86bde89-5d3b-4fc0-932e-8e8638208a03-kube-api-access-5cxtt\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.493839 4956 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.493849 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.493886 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.502596 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a86bde89-5d3b-4fc0-932e-8e8638208a03" (UID: "a86bde89-5d3b-4fc0-932e-8e8638208a03"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.510236 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-config" (OuterVolumeSpecName: "config") pod "a86bde89-5d3b-4fc0-932e-8e8638208a03" (UID: "a86bde89-5d3b-4fc0-932e-8e8638208a03"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.597830 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.597859 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a86bde89-5d3b-4fc0-932e-8e8638208a03-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.677488 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cbdcdc45c-b9667" event={"ID":"0a1e613a-d4fe-4779-97f9-a931d68f083c","Type":"ContainerStarted","Data":"64bd735d1524e2ebf5ad2c63c8510bb1310050b85f3adbadb3c8073243d6b5ad"} Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.677661 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.679466 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c68846bf-fppc2" event={"ID":"87cb9729-9cb1-4377-834e-b6e5c11e39a3","Type":"ContainerStarted","Data":"07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270"} Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.680297 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84c68846bf-fppc2" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.682458 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56bb449d8f-bh9xp" event={"ID":"6c9006da-cf02-4bed-8247-02b6e929ff98","Type":"ContainerStarted","Data":"2f501865ed758cc017e8454ed61e600c4ea187452f21bd94fbbfe8d93434c3d8"} Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.685320 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" event={"ID":"bc0eca45-f776-4d77-8589-a2605f824696","Type":"ContainerStarted","Data":"7650b1ae70d52d873cacd043c336442df9b1cfcd7c090991424d95e088d9e43d"} Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.685349 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" event={"ID":"bc0eca45-f776-4d77-8589-a2605f824696","Type":"ContainerStarted","Data":"bc001bb98c5abc0798c1cb3678f3322dbee908e04e7b47dcaebc46a894953bdb"} Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.687138 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9465cbc-c93e-4082-8ec8-fdd77d710210","Type":"ContainerStarted","Data":"fe604eb6614a3fd4df304f09dc0b0457a81958897fc0590d2f97f13bafe71ca4"} Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.688829 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-646f99fb9d-9lthp" event={"ID":"45e3d2ce-91f3-420a-b8a2-ebeb4c113565","Type":"ContainerStarted","Data":"19d6a3f4a842de43ac8339131f1c364d4c3a1e157417ed5d9ada81b59746056a"} Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.689323 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-646f99fb9d-9lthp" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.689350 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-646f99fb9d-9lthp" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.690793 4956 generic.go:334] "Generic (PLEG): container finished" podID="a86bde89-5d3b-4fc0-932e-8e8638208a03" containerID="8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd" exitCode=0 Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.690856 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" 
event={"ID":"a86bde89-5d3b-4fc0-932e-8e8638208a03","Type":"ContainerDied","Data":"8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd"} Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.690925 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" event={"ID":"a86bde89-5d3b-4fc0-932e-8e8638208a03","Type":"ContainerDied","Data":"408dbb4c646b61104d48cd33c18d4c3d0c393d609d2a7b25c01ef16a65ce3ddf"} Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.690946 4956 scope.go:117] "RemoveContainer" containerID="8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.690874 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549c96b4c7-k89tn" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.699760 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cbdcdc45c-b9667" podStartSLOduration=3.699745777 podStartE2EDuration="3.699745777s" podCreationTimestamp="2025-09-30 05:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:46:18.698044935 +0000 UTC m=+1049.025165470" watchObservedRunningTime="2025-09-30 05:46:18.699745777 +0000 UTC m=+1049.026866302" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.728779 4956 scope.go:117] "RemoveContainer" containerID="7e3cd3786c5dfb579e97503f6d0a3c0c4a81a3e6dd08f6933a1ce7dd36ab7c99" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.731465 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-56bb449d8f-bh9xp" podStartSLOduration=7.545091889 podStartE2EDuration="10.731455925s" podCreationTimestamp="2025-09-30 05:46:08 +0000 UTC" firstStartedPulling="2025-09-30 05:46:11.023806282 +0000 UTC m=+1041.350926807" lastFinishedPulling="2025-09-30 
05:46:14.210170318 +0000 UTC m=+1044.537290843" observedRunningTime="2025-09-30 05:46:18.729370949 +0000 UTC m=+1049.056491474" watchObservedRunningTime="2025-09-30 05:46:18.731455925 +0000 UTC m=+1049.058576450" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.749794 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84c68846bf-fppc2" podStartSLOduration=7.7497725299999995 podStartE2EDuration="7.74977253s" podCreationTimestamp="2025-09-30 05:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:46:18.747103796 +0000 UTC m=+1049.074224331" watchObservedRunningTime="2025-09-30 05:46:18.74977253 +0000 UTC m=+1049.076893045" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.769021 4956 scope.go:117] "RemoveContainer" containerID="8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.769464 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-96cd79c6-2b9nr" podStartSLOduration=4.8375709449999995 podStartE2EDuration="10.769441009s" podCreationTimestamp="2025-09-30 05:46:08 +0000 UTC" firstStartedPulling="2025-09-30 05:46:11.271127737 +0000 UTC m=+1041.598248262" lastFinishedPulling="2025-09-30 05:46:17.202997801 +0000 UTC m=+1047.530118326" observedRunningTime="2025-09-30 05:46:18.765572297 +0000 UTC m=+1049.092692822" watchObservedRunningTime="2025-09-30 05:46:18.769441009 +0000 UTC m=+1049.096561534" Sep 30 05:46:18 crc kubenswrapper[4956]: E0930 05:46:18.771922 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd\": container with ID starting with 8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd not found: ID does not exist" 
containerID="8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.774001 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd"} err="failed to get container status \"8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd\": rpc error: code = NotFound desc = could not find container \"8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd\": container with ID starting with 8a811462378c133cdc09c265898980962f02b05622d9e5dd18fc9abb25a303bd not found: ID does not exist" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.774033 4956 scope.go:117] "RemoveContainer" containerID="7e3cd3786c5dfb579e97503f6d0a3c0c4a81a3e6dd08f6933a1ce7dd36ab7c99" Sep 30 05:46:18 crc kubenswrapper[4956]: E0930 05:46:18.774380 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3cd3786c5dfb579e97503f6d0a3c0c4a81a3e6dd08f6933a1ce7dd36ab7c99\": container with ID starting with 7e3cd3786c5dfb579e97503f6d0a3c0c4a81a3e6dd08f6933a1ce7dd36ab7c99 not found: ID does not exist" containerID="7e3cd3786c5dfb579e97503f6d0a3c0c4a81a3e6dd08f6933a1ce7dd36ab7c99" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.774427 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3cd3786c5dfb579e97503f6d0a3c0c4a81a3e6dd08f6933a1ce7dd36ab7c99"} err="failed to get container status \"7e3cd3786c5dfb579e97503f6d0a3c0c4a81a3e6dd08f6933a1ce7dd36ab7c99\": rpc error: code = NotFound desc = could not find container \"7e3cd3786c5dfb579e97503f6d0a3c0c4a81a3e6dd08f6933a1ce7dd36ab7c99\": container with ID starting with 7e3cd3786c5dfb579e97503f6d0a3c0c4a81a3e6dd08f6933a1ce7dd36ab7c99 not found: ID does not exist" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.838504 4956 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-646f99fb9d-9lthp" podStartSLOduration=6.838484609 podStartE2EDuration="6.838484609s" podCreationTimestamp="2025-09-30 05:46:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:46:18.805604045 +0000 UTC m=+1049.132724570" watchObservedRunningTime="2025-09-30 05:46:18.838484609 +0000 UTC m=+1049.165605134" Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.842456 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-549c96b4c7-k89tn"] Sep 30 05:46:18 crc kubenswrapper[4956]: I0930 05:46:18.853048 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-549c96b4c7-k89tn"] Sep 30 05:46:19 crc kubenswrapper[4956]: I0930 05:46:19.702126 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9465cbc-c93e-4082-8ec8-fdd77d710210","Type":"ContainerStarted","Data":"8f8a6f9f7740abe06cc5bdd94291a38a3ad3a94cab9b7bf53c0e0f8ac5cf2df4"} Sep 30 05:46:20 crc kubenswrapper[4956]: I0930 05:46:20.355817 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a86bde89-5d3b-4fc0-932e-8e8638208a03" path="/var/lib/kubelet/pods/a86bde89-5d3b-4fc0-932e-8e8638208a03/volumes" Sep 30 05:46:20 crc kubenswrapper[4956]: I0930 05:46:20.529918 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 05:46:20 crc kubenswrapper[4956]: I0930 05:46:20.529993 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 05:46:20 crc kubenswrapper[4956]: I0930 05:46:20.530016 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 05:46:20 crc kubenswrapper[4956]: I0930 05:46:20.530038 4956 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 05:46:20 crc kubenswrapper[4956]: I0930 05:46:20.531807 4956 scope.go:117] "RemoveContainer" containerID="4ef654140d80b3a5537f7ceebebd417fcff6cad58728141934dbf21caaecab9f" Sep 30 05:46:20 crc kubenswrapper[4956]: E0930 05:46:20.541052 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aad83b1a-8ea2-4f43-a6e3-b8e844a65115)\"" pod="openstack/watcher-decision-engine-0" podUID="aad83b1a-8ea2-4f43-a6e3-b8e844a65115" Sep 30 05:46:20 crc kubenswrapper[4956]: I0930 05:46:20.714229 4956 scope.go:117] "RemoveContainer" containerID="4ef654140d80b3a5537f7ceebebd417fcff6cad58728141934dbf21caaecab9f" Sep 30 05:46:20 crc kubenswrapper[4956]: E0930 05:46:20.715544 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aad83b1a-8ea2-4f43-a6e3-b8e844a65115)\"" pod="openstack/watcher-decision-engine-0" podUID="aad83b1a-8ea2-4f43-a6e3-b8e844a65115" Sep 30 05:46:21 crc kubenswrapper[4956]: I0930 05:46:21.243616 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:21 crc kubenswrapper[4956]: I0930 05:46:21.270869 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:22 crc kubenswrapper[4956]: I0930 05:46:22.245370 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84c68846bf-fppc2" Sep 30 05:46:22 crc kubenswrapper[4956]: I0930 05:46:22.315165 4956 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cf77b4997-vmw7n"] Sep 30 05:46:22 crc kubenswrapper[4956]: I0930 05:46:22.315423 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" podUID="b0f69bb4-0f8e-489f-9a47-06e84127473e" containerName="dnsmasq-dns" containerID="cri-o://d19aa099dc4e5112b74d9ddd6d508896e97a9418be1fc9d214a21c4c543234c7" gracePeriod=10 Sep 30 05:46:22 crc kubenswrapper[4956]: I0930 05:46:22.981468 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" podUID="b0f69bb4-0f8e-489f-9a47-06e84127473e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused" Sep 30 05:46:23 crc kubenswrapper[4956]: I0930 05:46:23.709543 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:46:23 crc kubenswrapper[4956]: I0930 05:46:23.712174 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-654d9b45dd-f9lqj" Sep 30 05:46:23 crc kubenswrapper[4956]: I0930 05:46:23.829423 4956 generic.go:334] "Generic (PLEG): container finished" podID="b0f69bb4-0f8e-489f-9a47-06e84127473e" containerID="d19aa099dc4e5112b74d9ddd6d508896e97a9418be1fc9d214a21c4c543234c7" exitCode=0 Sep 30 05:46:23 crc kubenswrapper[4956]: I0930 05:46:23.830305 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" event={"ID":"b0f69bb4-0f8e-489f-9a47-06e84127473e","Type":"ContainerDied","Data":"d19aa099dc4e5112b74d9ddd6d508896e97a9418be1fc9d214a21c4c543234c7"} Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.249843 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.345134 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-ovsdbserver-sb\") pod \"b0f69bb4-0f8e-489f-9a47-06e84127473e\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.345228 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-dns-svc\") pod \"b0f69bb4-0f8e-489f-9a47-06e84127473e\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.345508 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-config\") pod \"b0f69bb4-0f8e-489f-9a47-06e84127473e\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.345655 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-ovsdbserver-nb\") pod \"b0f69bb4-0f8e-489f-9a47-06e84127473e\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.345700 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j4bt\" (UniqueName: \"kubernetes.io/projected/b0f69bb4-0f8e-489f-9a47-06e84127473e-kube-api-access-4j4bt\") pod \"b0f69bb4-0f8e-489f-9a47-06e84127473e\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.345770 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-dns-swift-storage-0\") pod \"b0f69bb4-0f8e-489f-9a47-06e84127473e\" (UID: \"b0f69bb4-0f8e-489f-9a47-06e84127473e\") " Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.354912 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f69bb4-0f8e-489f-9a47-06e84127473e-kube-api-access-4j4bt" (OuterVolumeSpecName: "kube-api-access-4j4bt") pod "b0f69bb4-0f8e-489f-9a47-06e84127473e" (UID: "b0f69bb4-0f8e-489f-9a47-06e84127473e"). InnerVolumeSpecName "kube-api-access-4j4bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.399139 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-config" (OuterVolumeSpecName: "config") pod "b0f69bb4-0f8e-489f-9a47-06e84127473e" (UID: "b0f69bb4-0f8e-489f-9a47-06e84127473e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.416421 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b0f69bb4-0f8e-489f-9a47-06e84127473e" (UID: "b0f69bb4-0f8e-489f-9a47-06e84127473e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.419135 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b0f69bb4-0f8e-489f-9a47-06e84127473e" (UID: "b0f69bb4-0f8e-489f-9a47-06e84127473e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.420659 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b0f69bb4-0f8e-489f-9a47-06e84127473e" (UID: "b0f69bb4-0f8e-489f-9a47-06e84127473e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.421610 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b0f69bb4-0f8e-489f-9a47-06e84127473e" (UID: "b0f69bb4-0f8e-489f-9a47-06e84127473e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.447991 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.448049 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.448058 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.448069 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:24 crc 
kubenswrapper[4956]: I0930 05:46:24.448078 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j4bt\" (UniqueName: \"kubernetes.io/projected/b0f69bb4-0f8e-489f-9a47-06e84127473e-kube-api-access-4j4bt\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.448088 4956 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0f69bb4-0f8e-489f-9a47-06e84127473e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.536733 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-646f99fb9d-9lthp" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.564127 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-646f99fb9d-9lthp" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.622211 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-574fdff478-trbvv"] Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.622441 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-574fdff478-trbvv" podUID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" containerName="barbican-api-log" containerID="cri-o://30b57311abaacd79a790ede4294ca375e4e289e726c3c3ba8a865bde408d2eca" gracePeriod=30 Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.622763 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-574fdff478-trbvv" podUID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" containerName="barbican-api" containerID="cri-o://00d21ce2570d50ab021776408cbad2f074a32d9fc947a029a70dea9260a6c1fb" gracePeriod=30 Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.628247 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-574fdff478-trbvv" 
podUID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": EOF" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.628280 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-574fdff478-trbvv" podUID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": EOF" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.851408 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" event={"ID":"b0f69bb4-0f8e-489f-9a47-06e84127473e","Type":"ContainerDied","Data":"df186e38457ba1be9277aa57d6493f4a82ede481f5dcf32a015f277caeac8e37"} Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.851792 4956 scope.go:117] "RemoveContainer" containerID="d19aa099dc4e5112b74d9ddd6d508896e97a9418be1fc9d214a21c4c543234c7" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.851972 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cf77b4997-vmw7n" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.862634 4956 generic.go:334] "Generic (PLEG): container finished" podID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" containerID="30b57311abaacd79a790ede4294ca375e4e289e726c3c3ba8a865bde408d2eca" exitCode=143 Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.862700 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574fdff478-trbvv" event={"ID":"5bc192d7-07d9-4224-a9dc-b2a160c9c81a","Type":"ContainerDied","Data":"30b57311abaacd79a790ede4294ca375e4e289e726c3c3ba8a865bde408d2eca"} Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.886018 4956 scope.go:117] "RemoveContainer" containerID="df6cc740622ec28a7a6ace160739af8fe6b68f41199c3a33a3a5364a046cf626" Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.888024 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9465cbc-c93e-4082-8ec8-fdd77d710210","Type":"ContainerStarted","Data":"77bd564eb7f6df95059f7fccc0ed74f9f114bc49b66253c257593c3cc88ec341"} Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.942243 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cf77b4997-vmw7n"] Sep 30 05:46:24 crc kubenswrapper[4956]: I0930 05:46:24.979182 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cf77b4997-vmw7n"] Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.480526 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6798cf9d78-m426q" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.632493 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zh2r7"] Sep 30 05:46:25 crc kubenswrapper[4956]: E0930 05:46:25.633173 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86bde89-5d3b-4fc0-932e-8e8638208a03" containerName="dnsmasq-dns" Sep 
30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.633197 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86bde89-5d3b-4fc0-932e-8e8638208a03" containerName="dnsmasq-dns" Sep 30 05:46:25 crc kubenswrapper[4956]: E0930 05:46:25.633229 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f69bb4-0f8e-489f-9a47-06e84127473e" containerName="dnsmasq-dns" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.633240 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f69bb4-0f8e-489f-9a47-06e84127473e" containerName="dnsmasq-dns" Sep 30 05:46:25 crc kubenswrapper[4956]: E0930 05:46:25.633273 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86bde89-5d3b-4fc0-932e-8e8638208a03" containerName="init" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.633279 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86bde89-5d3b-4fc0-932e-8e8638208a03" containerName="init" Sep 30 05:46:25 crc kubenswrapper[4956]: E0930 05:46:25.633293 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f69bb4-0f8e-489f-9a47-06e84127473e" containerName="init" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.633299 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f69bb4-0f8e-489f-9a47-06e84127473e" containerName="init" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.633516 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f69bb4-0f8e-489f-9a47-06e84127473e" containerName="dnsmasq-dns" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.633541 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a86bde89-5d3b-4fc0-932e-8e8638208a03" containerName="dnsmasq-dns" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.639694 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh2r7"] Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.639850 4956 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.681781 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31db92b4-e9cf-42e6-a531-9795e91837a2-catalog-content\") pod \"redhat-marketplace-zh2r7\" (UID: \"31db92b4-e9cf-42e6-a531-9795e91837a2\") " pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.681836 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsvqj\" (UniqueName: \"kubernetes.io/projected/31db92b4-e9cf-42e6-a531-9795e91837a2-kube-api-access-tsvqj\") pod \"redhat-marketplace-zh2r7\" (UID: \"31db92b4-e9cf-42e6-a531-9795e91837a2\") " pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.681863 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31db92b4-e9cf-42e6-a531-9795e91837a2-utilities\") pod \"redhat-marketplace-zh2r7\" (UID: \"31db92b4-e9cf-42e6-a531-9795e91837a2\") " pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.784053 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31db92b4-e9cf-42e6-a531-9795e91837a2-catalog-content\") pod \"redhat-marketplace-zh2r7\" (UID: \"31db92b4-e9cf-42e6-a531-9795e91837a2\") " pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.784451 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsvqj\" (UniqueName: \"kubernetes.io/projected/31db92b4-e9cf-42e6-a531-9795e91837a2-kube-api-access-tsvqj\") pod 
\"redhat-marketplace-zh2r7\" (UID: \"31db92b4-e9cf-42e6-a531-9795e91837a2\") " pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.784479 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31db92b4-e9cf-42e6-a531-9795e91837a2-utilities\") pod \"redhat-marketplace-zh2r7\" (UID: \"31db92b4-e9cf-42e6-a531-9795e91837a2\") " pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.784738 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31db92b4-e9cf-42e6-a531-9795e91837a2-catalog-content\") pod \"redhat-marketplace-zh2r7\" (UID: \"31db92b4-e9cf-42e6-a531-9795e91837a2\") " pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.784941 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31db92b4-e9cf-42e6-a531-9795e91837a2-utilities\") pod \"redhat-marketplace-zh2r7\" (UID: \"31db92b4-e9cf-42e6-a531-9795e91837a2\") " pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.805649 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsvqj\" (UniqueName: \"kubernetes.io/projected/31db92b4-e9cf-42e6-a531-9795e91837a2-kube-api-access-tsvqj\") pod \"redhat-marketplace-zh2r7\" (UID: \"31db92b4-e9cf-42e6-a531-9795e91837a2\") " pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.825555 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h5ch9"] Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.831752 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.847192 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h5ch9"] Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.888078 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4tmq\" (UniqueName: \"kubernetes.io/projected/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-kube-api-access-d4tmq\") pod \"redhat-operators-h5ch9\" (UID: \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\") " pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.888169 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-utilities\") pod \"redhat-operators-h5ch9\" (UID: \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\") " pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.888359 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-catalog-content\") pod \"redhat-operators-h5ch9\" (UID: \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\") " pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.971178 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9465cbc-c93e-4082-8ec8-fdd77d710210","Type":"ContainerStarted","Data":"2e30f70aea206ad0d70110b426528eee91efaa80099cb2625693b0a82f9baf28"} Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.971811 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.976619 
4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.998788 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.697741186 podStartE2EDuration="11.998767404s" podCreationTimestamp="2025-09-30 05:46:14 +0000 UTC" firstStartedPulling="2025-09-30 05:46:15.80885949 +0000 UTC m=+1046.135980015" lastFinishedPulling="2025-09-30 05:46:25.109885708 +0000 UTC m=+1055.437006233" observedRunningTime="2025-09-30 05:46:25.998563097 +0000 UTC m=+1056.325683622" watchObservedRunningTime="2025-09-30 05:46:25.998767404 +0000 UTC m=+1056.325887939" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.999196 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4tmq\" (UniqueName: \"kubernetes.io/projected/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-kube-api-access-d4tmq\") pod \"redhat-operators-h5ch9\" (UID: \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\") " pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.999258 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-utilities\") pod \"redhat-operators-h5ch9\" (UID: \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\") " pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.999326 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-catalog-content\") pod \"redhat-operators-h5ch9\" (UID: \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\") " pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:25 crc kubenswrapper[4956]: I0930 05:46:25.999827 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-catalog-content\") pod \"redhat-operators-h5ch9\" (UID: \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\") " pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.003051 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-utilities\") pod \"redhat-operators-h5ch9\" (UID: \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\") " pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.028715 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4tmq\" (UniqueName: \"kubernetes.io/projected/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-kube-api-access-d4tmq\") pod \"redhat-operators-h5ch9\" (UID: \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\") " pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.177154 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.425471 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0f69bb4-0f8e-489f-9a47-06e84127473e" path="/var/lib/kubelet/pods/b0f69bb4-0f8e-489f-9a47-06e84127473e/volumes" Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.584416 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh2r7"] Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.785122 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-68c96b554d-8mtnt" podUID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.785671 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.879527 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h5ch9"] Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.985904 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nnr6k" event={"ID":"740de0e5-c3e9-43bc-bb01-8c240f50070e","Type":"ContainerStarted","Data":"3bfbff8657b38fae883e6d49f7723f7336c138325a9cb86b8eb583bcdd980336"} Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.987817 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5ch9" event={"ID":"fcdacf1b-ba86-43bc-94af-a78f1ac7146d","Type":"ContainerStarted","Data":"5c62f8e49df70caf7cc911f48d410543f2f0fdfe91265e674faaee1cc5fe8acb"} Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.992369 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="31db92b4-e9cf-42e6-a531-9795e91837a2" containerID="a8f3689275a8d51bfa2cf96150bad7b79f1256df8da43b7202a297e45c334057" exitCode=0 Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.992499 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh2r7" event={"ID":"31db92b4-e9cf-42e6-a531-9795e91837a2","Type":"ContainerDied","Data":"a8f3689275a8d51bfa2cf96150bad7b79f1256df8da43b7202a297e45c334057"} Sep 30 05:46:26 crc kubenswrapper[4956]: I0930 05:46:26.992566 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh2r7" event={"ID":"31db92b4-e9cf-42e6-a531-9795e91837a2","Type":"ContainerStarted","Data":"98226c2a35e4cf0a3fc0fb3e71fa57da6eae6e56f6c9209e00a7e9c76437f2cf"} Sep 30 05:46:27 crc kubenswrapper[4956]: I0930 05:46:27.006376 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-nnr6k" podStartSLOduration=6.446122697 podStartE2EDuration="51.006340951s" podCreationTimestamp="2025-09-30 05:45:36 +0000 UTC" firstStartedPulling="2025-09-30 05:45:40.999803966 +0000 UTC m=+1011.326924491" lastFinishedPulling="2025-09-30 05:46:25.56002222 +0000 UTC m=+1055.887142745" observedRunningTime="2025-09-30 05:46:27.002903884 +0000 UTC m=+1057.330024409" watchObservedRunningTime="2025-09-30 05:46:27.006340951 +0000 UTC m=+1057.333461476" Sep 30 05:46:28 crc kubenswrapper[4956]: I0930 05:46:28.006870 4956 generic.go:334] "Generic (PLEG): container finished" podID="31db92b4-e9cf-42e6-a531-9795e91837a2" containerID="1659362fde65409e4e06a6a1a58955816ae7c5b328d89e16568fd871f11ce1f9" exitCode=0 Sep 30 05:46:28 crc kubenswrapper[4956]: I0930 05:46:28.007011 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh2r7" event={"ID":"31db92b4-e9cf-42e6-a531-9795e91837a2","Type":"ContainerDied","Data":"1659362fde65409e4e06a6a1a58955816ae7c5b328d89e16568fd871f11ce1f9"} Sep 30 05:46:28 crc 
kubenswrapper[4956]: I0930 05:46:28.012030 4956 generic.go:334] "Generic (PLEG): container finished" podID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" containerID="0501678af21f6fe0a745c8cccfd8186ed6eef4fb6f3658213697b20b5f772d60" exitCode=0 Sep 30 05:46:28 crc kubenswrapper[4956]: I0930 05:46:28.012093 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5ch9" event={"ID":"fcdacf1b-ba86-43bc-94af-a78f1ac7146d","Type":"ContainerDied","Data":"0501678af21f6fe0a745c8cccfd8186ed6eef4fb6f3658213697b20b5f772d60"} Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.036980 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh2r7" event={"ID":"31db92b4-e9cf-42e6-a531-9795e91837a2","Type":"ContainerStarted","Data":"de18a41f9c74fb2ad0dbd5cc40953a0d3985d1a84dc5b6082d66d79ba087f2d2"} Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.041823 4956 generic.go:334] "Generic (PLEG): container finished" podID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" containerID="00d21ce2570d50ab021776408cbad2f074a32d9fc947a029a70dea9260a6c1fb" exitCode=0 Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.041853 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574fdff478-trbvv" event={"ID":"5bc192d7-07d9-4224-a9dc-b2a160c9c81a","Type":"ContainerDied","Data":"00d21ce2570d50ab021776408cbad2f074a32d9fc947a029a70dea9260a6c1fb"} Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.056458 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zh2r7" podStartSLOduration=2.62110384 podStartE2EDuration="4.056440706s" podCreationTimestamp="2025-09-30 05:46:25 +0000 UTC" firstStartedPulling="2025-09-30 05:46:26.993650952 +0000 UTC m=+1057.320771477" lastFinishedPulling="2025-09-30 05:46:28.428987828 +0000 UTC m=+1058.756108343" observedRunningTime="2025-09-30 05:46:29.055691501 +0000 UTC m=+1059.382812026" 
watchObservedRunningTime="2025-09-30 05:46:29.056440706 +0000 UTC m=+1059.383561231" Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.176807 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.286342 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x2vv\" (UniqueName: \"kubernetes.io/projected/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-kube-api-access-6x2vv\") pod \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.286415 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-config-data-custom\") pod \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.286503 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-logs\") pod \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.286554 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-combined-ca-bundle\") pod \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\" (UID: \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.286574 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-config-data\") pod \"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\" (UID: 
\"5bc192d7-07d9-4224-a9dc-b2a160c9c81a\") " Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.287403 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-logs" (OuterVolumeSpecName: "logs") pod "5bc192d7-07d9-4224-a9dc-b2a160c9c81a" (UID: "5bc192d7-07d9-4224-a9dc-b2a160c9c81a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.297026 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-kube-api-access-6x2vv" (OuterVolumeSpecName: "kube-api-access-6x2vv") pod "5bc192d7-07d9-4224-a9dc-b2a160c9c81a" (UID: "5bc192d7-07d9-4224-a9dc-b2a160c9c81a"). InnerVolumeSpecName "kube-api-access-6x2vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.299284 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5bc192d7-07d9-4224-a9dc-b2a160c9c81a" (UID: "5bc192d7-07d9-4224-a9dc-b2a160c9c81a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.318826 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bc192d7-07d9-4224-a9dc-b2a160c9c81a" (UID: "5bc192d7-07d9-4224-a9dc-b2a160c9c81a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.354961 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-config-data" (OuterVolumeSpecName: "config-data") pod "5bc192d7-07d9-4224-a9dc-b2a160c9c81a" (UID: "5bc192d7-07d9-4224-a9dc-b2a160c9c81a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.389198 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x2vv\" (UniqueName: \"kubernetes.io/projected/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-kube-api-access-6x2vv\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.389247 4956 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.389259 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.389279 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:29 crc kubenswrapper[4956]: I0930 05:46:29.389290 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bc192d7-07d9-4224-a9dc-b2a160c9c81a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.054540 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5ch9" 
event={"ID":"fcdacf1b-ba86-43bc-94af-a78f1ac7146d","Type":"ContainerStarted","Data":"29da40dab5f6db4d465a7cc865e3b8867ee1ed79f208d280f6bc129733cd56d0"} Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.056835 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-574fdff478-trbvv" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.056839 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574fdff478-trbvv" event={"ID":"5bc192d7-07d9-4224-a9dc-b2a160c9c81a","Type":"ContainerDied","Data":"fb166dd9918f8032ece76cdd54908865d9c2480069f50ec16f34c35202547edd"} Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.056922 4956 scope.go:117] "RemoveContainer" containerID="00d21ce2570d50ab021776408cbad2f074a32d9fc947a029a70dea9260a6c1fb" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.100141 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-574fdff478-trbvv"] Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.100364 4956 scope.go:117] "RemoveContainer" containerID="30b57311abaacd79a790ede4294ca375e4e289e726c3c3ba8a865bde408d2eca" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.108920 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-574fdff478-trbvv"] Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.353960 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" path="/var/lib/kubelet/pods/5bc192d7-07d9-4224-a9dc-b2a160c9c81a/volumes" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.619972 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 05:46:30 crc kubenswrapper[4956]: E0930 05:46:30.620393 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" containerName="barbican-api" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 
05:46:30.620413 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" containerName="barbican-api" Sep 30 05:46:30 crc kubenswrapper[4956]: E0930 05:46:30.620423 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" containerName="barbican-api-log" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.620429 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" containerName="barbican-api-log" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.620636 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" containerName="barbican-api" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.620657 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc192d7-07d9-4224-a9dc-b2a160c9c81a" containerName="barbican-api-log" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.621703 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.625713 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.625785 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.626222 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-frjsk" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.635875 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.719938 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzh9g\" (UniqueName: \"kubernetes.io/projected/f596b95b-7b2b-4d7b-8f33-9eb214a39a21-kube-api-access-vzh9g\") pod \"openstackclient\" (UID: \"f596b95b-7b2b-4d7b-8f33-9eb214a39a21\") " pod="openstack/openstackclient" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.720025 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f596b95b-7b2b-4d7b-8f33-9eb214a39a21-openstack-config\") pod \"openstackclient\" (UID: \"f596b95b-7b2b-4d7b-8f33-9eb214a39a21\") " pod="openstack/openstackclient" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.720069 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f596b95b-7b2b-4d7b-8f33-9eb214a39a21-openstack-config-secret\") pod \"openstackclient\" (UID: \"f596b95b-7b2b-4d7b-8f33-9eb214a39a21\") " pod="openstack/openstackclient" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.720101 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f596b95b-7b2b-4d7b-8f33-9eb214a39a21-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f596b95b-7b2b-4d7b-8f33-9eb214a39a21\") " pod="openstack/openstackclient" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.821618 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzh9g\" (UniqueName: \"kubernetes.io/projected/f596b95b-7b2b-4d7b-8f33-9eb214a39a21-kube-api-access-vzh9g\") pod \"openstackclient\" (UID: \"f596b95b-7b2b-4d7b-8f33-9eb214a39a21\") " pod="openstack/openstackclient" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.821714 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f596b95b-7b2b-4d7b-8f33-9eb214a39a21-openstack-config\") pod \"openstackclient\" (UID: \"f596b95b-7b2b-4d7b-8f33-9eb214a39a21\") " pod="openstack/openstackclient" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.821744 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f596b95b-7b2b-4d7b-8f33-9eb214a39a21-openstack-config-secret\") pod \"openstackclient\" (UID: \"f596b95b-7b2b-4d7b-8f33-9eb214a39a21\") " pod="openstack/openstackclient" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.821779 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f596b95b-7b2b-4d7b-8f33-9eb214a39a21-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f596b95b-7b2b-4d7b-8f33-9eb214a39a21\") " pod="openstack/openstackclient" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.822828 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/f596b95b-7b2b-4d7b-8f33-9eb214a39a21-openstack-config\") pod \"openstackclient\" (UID: \"f596b95b-7b2b-4d7b-8f33-9eb214a39a21\") " pod="openstack/openstackclient" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.827771 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f596b95b-7b2b-4d7b-8f33-9eb214a39a21-openstack-config-secret\") pod \"openstackclient\" (UID: \"f596b95b-7b2b-4d7b-8f33-9eb214a39a21\") " pod="openstack/openstackclient" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.830086 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f596b95b-7b2b-4d7b-8f33-9eb214a39a21-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f596b95b-7b2b-4d7b-8f33-9eb214a39a21\") " pod="openstack/openstackclient" Sep 30 05:46:30 crc kubenswrapper[4956]: I0930 05:46:30.842646 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzh9g\" (UniqueName: \"kubernetes.io/projected/f596b95b-7b2b-4d7b-8f33-9eb214a39a21-kube-api-access-vzh9g\") pod \"openstackclient\" (UID: \"f596b95b-7b2b-4d7b-8f33-9eb214a39a21\") " pod="openstack/openstackclient" Sep 30 05:46:31 crc kubenswrapper[4956]: I0930 05:46:31.001596 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 05:46:31 crc kubenswrapper[4956]: I0930 05:46:31.071379 4956 generic.go:334] "Generic (PLEG): container finished" podID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" containerID="29da40dab5f6db4d465a7cc865e3b8867ee1ed79f208d280f6bc129733cd56d0" exitCode=0 Sep 30 05:46:31 crc kubenswrapper[4956]: I0930 05:46:31.071436 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5ch9" event={"ID":"fcdacf1b-ba86-43bc-94af-a78f1ac7146d","Type":"ContainerDied","Data":"29da40dab5f6db4d465a7cc865e3b8867ee1ed79f208d280f6bc129733cd56d0"} Sep 30 05:46:31 crc kubenswrapper[4956]: I0930 05:46:31.340946 4956 scope.go:117] "RemoveContainer" containerID="4ef654140d80b3a5537f7ceebebd417fcff6cad58728141934dbf21caaecab9f" Sep 30 05:46:31 crc kubenswrapper[4956]: E0930 05:46:31.341620 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aad83b1a-8ea2-4f43-a6e3-b8e844a65115)\"" pod="openstack/watcher-decision-engine-0" podUID="aad83b1a-8ea2-4f43-a6e3-b8e844a65115" Sep 30 05:46:31 crc kubenswrapper[4956]: I0930 05:46:31.465770 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 05:46:31 crc kubenswrapper[4956]: W0930 05:46:31.487840 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf596b95b_7b2b_4d7b_8f33_9eb214a39a21.slice/crio-d1e7ae3c4db8a0577885545674bb03787f8e4339c7e064c3e8ab5962560f903d WatchSource:0}: Error finding container d1e7ae3c4db8a0577885545674bb03787f8e4339c7e064c3e8ab5962560f903d: Status 404 returned error can't find the container with id d1e7ae3c4db8a0577885545674bb03787f8e4339c7e064c3e8ab5962560f903d Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 
05:46:32.107402 4956 generic.go:334] "Generic (PLEG): container finished" podID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" containerID="39f6fba81bf2338227b34babb54a50886f23dca04dacce224c3d4af146eaea5f" exitCode=137 Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.107454 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68c96b554d-8mtnt" event={"ID":"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95","Type":"ContainerDied","Data":"39f6fba81bf2338227b34babb54a50886f23dca04dacce224c3d4af146eaea5f"} Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.111568 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f596b95b-7b2b-4d7b-8f33-9eb214a39a21","Type":"ContainerStarted","Data":"d1e7ae3c4db8a0577885545674bb03787f8e4339c7e064c3e8ab5962560f903d"} Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.376477 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.452998 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-horizon-tls-certs\") pod \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.453140 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-config-data\") pod \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.453181 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-combined-ca-bundle\") pod 
\"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.453267 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-logs\") pod \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.453483 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jthms\" (UniqueName: \"kubernetes.io/projected/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-kube-api-access-jthms\") pod \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.453503 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-horizon-secret-key\") pod \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.453555 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-scripts\") pod \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\" (UID: \"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95\") " Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.457598 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-logs" (OuterVolumeSpecName: "logs") pod "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" (UID: "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.462205 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-kube-api-access-jthms" (OuterVolumeSpecName: "kube-api-access-jthms") pod "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" (UID: "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95"). InnerVolumeSpecName "kube-api-access-jthms". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.462600 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" (UID: "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.493449 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-scripts" (OuterVolumeSpecName: "scripts") pod "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" (UID: "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.495918 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-config-data" (OuterVolumeSpecName: "config-data") pod "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" (UID: "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.519757 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" (UID: "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.520649 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" (UID: "bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.556481 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.556721 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jthms\" (UniqueName: \"kubernetes.io/projected/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-kube-api-access-jthms\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.556827 4956 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.556911 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-scripts\") on node \"crc\" DevicePath 
\"\"" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.556989 4956 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.557076 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:32 crc kubenswrapper[4956]: I0930 05:46:32.557196 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:33 crc kubenswrapper[4956]: I0930 05:46:33.126397 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5ch9" event={"ID":"fcdacf1b-ba86-43bc-94af-a78f1ac7146d","Type":"ContainerStarted","Data":"ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92"} Sep 30 05:46:33 crc kubenswrapper[4956]: I0930 05:46:33.129306 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68c96b554d-8mtnt" event={"ID":"bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95","Type":"ContainerDied","Data":"ca0b91b0c9cdd07aa2a4a914b6bec221c543d307509862bc0d5da7d55f1c0020"} Sep 30 05:46:33 crc kubenswrapper[4956]: I0930 05:46:33.129349 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68c96b554d-8mtnt" Sep 30 05:46:33 crc kubenswrapper[4956]: I0930 05:46:33.129383 4956 scope.go:117] "RemoveContainer" containerID="7eafd8dbb682893d7dec3183ccf8ede2bfcfc62744d82946285977f5c6f648d8" Sep 30 05:46:33 crc kubenswrapper[4956]: I0930 05:46:33.167185 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h5ch9" podStartSLOduration=4.251371885 podStartE2EDuration="8.167158694s" podCreationTimestamp="2025-09-30 05:46:25 +0000 UTC" firstStartedPulling="2025-09-30 05:46:28.014232019 +0000 UTC m=+1058.341352544" lastFinishedPulling="2025-09-30 05:46:31.930018818 +0000 UTC m=+1062.257139353" observedRunningTime="2025-09-30 05:46:33.156189818 +0000 UTC m=+1063.483310343" watchObservedRunningTime="2025-09-30 05:46:33.167158694 +0000 UTC m=+1063.494279219" Sep 30 05:46:33 crc kubenswrapper[4956]: I0930 05:46:33.181424 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68c96b554d-8mtnt"] Sep 30 05:46:33 crc kubenswrapper[4956]: I0930 05:46:33.189994 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-68c96b554d-8mtnt"] Sep 30 05:46:33 crc kubenswrapper[4956]: I0930 05:46:33.394863 4956 scope.go:117] "RemoveContainer" containerID="39f6fba81bf2338227b34babb54a50886f23dca04dacce224c3d4af146eaea5f" Sep 30 05:46:34 crc kubenswrapper[4956]: I0930 05:46:34.358367 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" path="/var/lib/kubelet/pods/bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95/volumes" Sep 30 05:46:34 crc kubenswrapper[4956]: I0930 05:46:34.915103 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:34 crc kubenswrapper[4956]: I0930 05:46:34.915862 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" 
containerName="ceilometer-central-agent" containerID="cri-o://fe604eb6614a3fd4df304f09dc0b0457a81958897fc0590d2f97f13bafe71ca4" gracePeriod=30 Sep 30 05:46:34 crc kubenswrapper[4956]: I0930 05:46:34.916322 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="proxy-httpd" containerID="cri-o://2e30f70aea206ad0d70110b426528eee91efaa80099cb2625693b0a82f9baf28" gracePeriod=30 Sep 30 05:46:34 crc kubenswrapper[4956]: I0930 05:46:34.916389 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="ceilometer-notification-agent" containerID="cri-o://8f8a6f9f7740abe06cc5bdd94291a38a3ad3a94cab9b7bf53c0e0f8ac5cf2df4" gracePeriod=30 Sep 30 05:46:34 crc kubenswrapper[4956]: I0930 05:46:34.916468 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="sg-core" containerID="cri-o://77bd564eb7f6df95059f7fccc0ed74f9f114bc49b66253c257593c3cc88ec341" gracePeriod=30 Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.156322 4956 generic.go:334] "Generic (PLEG): container finished" podID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerID="2e30f70aea206ad0d70110b426528eee91efaa80099cb2625693b0a82f9baf28" exitCode=0 Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.156368 4956 generic.go:334] "Generic (PLEG): container finished" podID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerID="77bd564eb7f6df95059f7fccc0ed74f9f114bc49b66253c257593c3cc88ec341" exitCode=2 Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.156392 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9465cbc-c93e-4082-8ec8-fdd77d710210","Type":"ContainerDied","Data":"2e30f70aea206ad0d70110b426528eee91efaa80099cb2625693b0a82f9baf28"} Sep 
30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.156458 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9465cbc-c93e-4082-8ec8-fdd77d710210","Type":"ContainerDied","Data":"77bd564eb7f6df95059f7fccc0ed74f9f114bc49b66253c257593c3cc88ec341"} Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.456532 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-779b5888b9-9hp77"] Sep 30 05:46:35 crc kubenswrapper[4956]: E0930 05:46:35.457262 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" containerName="horizon" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.457285 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" containerName="horizon" Sep 30 05:46:35 crc kubenswrapper[4956]: E0930 05:46:35.457321 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" containerName="horizon-log" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.457329 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" containerName="horizon-log" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.457577 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" containerName="horizon" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.457616 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9f6b53-6b3b-4e6f-a825-14e5ba80ff95" containerName="horizon-log" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.459267 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.465546 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.465734 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.465764 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.472388 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-779b5888b9-9hp77"] Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.527909 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62113d6a-1e88-402d-b6bd-4f119a6df416-log-httpd\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.527986 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62113d6a-1e88-402d-b6bd-4f119a6df416-etc-swift\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.528067 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62113d6a-1e88-402d-b6bd-4f119a6df416-config-data\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.528115 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62113d6a-1e88-402d-b6bd-4f119a6df416-combined-ca-bundle\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.528179 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62113d6a-1e88-402d-b6bd-4f119a6df416-public-tls-certs\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.528208 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62113d6a-1e88-402d-b6bd-4f119a6df416-internal-tls-certs\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.528305 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpj8x\" (UniqueName: \"kubernetes.io/projected/62113d6a-1e88-402d-b6bd-4f119a6df416-kube-api-access-zpj8x\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.528336 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62113d6a-1e88-402d-b6bd-4f119a6df416-run-httpd\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 
05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.630083 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62113d6a-1e88-402d-b6bd-4f119a6df416-internal-tls-certs\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.630167 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpj8x\" (UniqueName: \"kubernetes.io/projected/62113d6a-1e88-402d-b6bd-4f119a6df416-kube-api-access-zpj8x\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.630192 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62113d6a-1e88-402d-b6bd-4f119a6df416-run-httpd\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.630252 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62113d6a-1e88-402d-b6bd-4f119a6df416-log-httpd\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.630290 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62113d6a-1e88-402d-b6bd-4f119a6df416-etc-swift\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.630357 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62113d6a-1e88-402d-b6bd-4f119a6df416-config-data\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.630388 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62113d6a-1e88-402d-b6bd-4f119a6df416-combined-ca-bundle\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.630412 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62113d6a-1e88-402d-b6bd-4f119a6df416-public-tls-certs\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.631612 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62113d6a-1e88-402d-b6bd-4f119a6df416-log-httpd\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.631849 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62113d6a-1e88-402d-b6bd-4f119a6df416-run-httpd\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.637237 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/62113d6a-1e88-402d-b6bd-4f119a6df416-etc-swift\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.641824 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62113d6a-1e88-402d-b6bd-4f119a6df416-config-data\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.646151 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62113d6a-1e88-402d-b6bd-4f119a6df416-combined-ca-bundle\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.647607 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62113d6a-1e88-402d-b6bd-4f119a6df416-internal-tls-certs\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.652069 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62113d6a-1e88-402d-b6bd-4f119a6df416-public-tls-certs\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.653788 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpj8x\" (UniqueName: 
\"kubernetes.io/projected/62113d6a-1e88-402d-b6bd-4f119a6df416-kube-api-access-zpj8x\") pod \"swift-proxy-779b5888b9-9hp77\" (UID: \"62113d6a-1e88-402d-b6bd-4f119a6df416\") " pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.790082 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.978332 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:35 crc kubenswrapper[4956]: I0930 05:46:35.978905 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:36 crc kubenswrapper[4956]: I0930 05:46:36.043691 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:36 crc kubenswrapper[4956]: I0930 05:46:36.176676 4956 generic.go:334] "Generic (PLEG): container finished" podID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerID="fe604eb6614a3fd4df304f09dc0b0457a81958897fc0590d2f97f13bafe71ca4" exitCode=0 Sep 30 05:46:36 crc kubenswrapper[4956]: I0930 05:46:36.176925 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9465cbc-c93e-4082-8ec8-fdd77d710210","Type":"ContainerDied","Data":"fe604eb6614a3fd4df304f09dc0b0457a81958897fc0590d2f97f13bafe71ca4"} Sep 30 05:46:36 crc kubenswrapper[4956]: I0930 05:46:36.177375 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:36 crc kubenswrapper[4956]: I0930 05:46:36.180442 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:46:36 crc kubenswrapper[4956]: I0930 05:46:36.397770 4956 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:36 crc kubenswrapper[4956]: I0930 05:46:36.521684 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-779b5888b9-9hp77"] Sep 30 05:46:37 crc kubenswrapper[4956]: I0930 05:46:37.193713 4956 generic.go:334] "Generic (PLEG): container finished" podID="740de0e5-c3e9-43bc-bb01-8c240f50070e" containerID="3bfbff8657b38fae883e6d49f7723f7336c138325a9cb86b8eb583bcdd980336" exitCode=0 Sep 30 05:46:37 crc kubenswrapper[4956]: I0930 05:46:37.193831 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nnr6k" event={"ID":"740de0e5-c3e9-43bc-bb01-8c240f50070e","Type":"ContainerDied","Data":"3bfbff8657b38fae883e6d49f7723f7336c138325a9cb86b8eb583bcdd980336"} Sep 30 05:46:37 crc kubenswrapper[4956]: I0930 05:46:37.201272 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-779b5888b9-9hp77" event={"ID":"62113d6a-1e88-402d-b6bd-4f119a6df416","Type":"ContainerStarted","Data":"125e3824f6efeae5403c76a5b5135365537edbbcd6fedf2515242b0b213d076c"} Sep 30 05:46:37 crc kubenswrapper[4956]: I0930 05:46:37.201326 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-779b5888b9-9hp77" event={"ID":"62113d6a-1e88-402d-b6bd-4f119a6df416","Type":"ContainerStarted","Data":"b30ce1032a76816c87576228212f5ee7baaab4e1fba1c7dc8938815eb3181e97"} Sep 30 05:46:37 crc kubenswrapper[4956]: I0930 05:46:37.201339 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-779b5888b9-9hp77" event={"ID":"62113d6a-1e88-402d-b6bd-4f119a6df416","Type":"ContainerStarted","Data":"fdf8bd3fc9f053a964ec09e98238a1ade5142f0d6ed57c7f0a0899e127a60923"} Sep 30 05:46:37 crc kubenswrapper[4956]: I0930 05:46:37.202382 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:37 crc kubenswrapper[4956]: I0930 05:46:37.202518 4956 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:37 crc kubenswrapper[4956]: I0930 05:46:37.246441 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-779b5888b9-9hp77" podStartSLOduration=2.2464238229999998 podStartE2EDuration="2.246423823s" podCreationTimestamp="2025-09-30 05:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:46:37.241790858 +0000 UTC m=+1067.568911393" watchObservedRunningTime="2025-09-30 05:46:37.246423823 +0000 UTC m=+1067.573544338" Sep 30 05:46:37 crc kubenswrapper[4956]: I0930 05:46:37.318308 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h5ch9" podUID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" containerName="registry-server" probeResult="failure" output=< Sep 30 05:46:37 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Sep 30 05:46:37 crc kubenswrapper[4956]: > Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.420233 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh2r7"] Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.783646 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.901388 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8xzz\" (UniqueName: \"kubernetes.io/projected/740de0e5-c3e9-43bc-bb01-8c240f50070e-kube-api-access-f8xzz\") pod \"740de0e5-c3e9-43bc-bb01-8c240f50070e\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.901491 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/740de0e5-c3e9-43bc-bb01-8c240f50070e-etc-machine-id\") pod \"740de0e5-c3e9-43bc-bb01-8c240f50070e\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.901536 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-config-data\") pod \"740de0e5-c3e9-43bc-bb01-8c240f50070e\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.901555 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-scripts\") pod \"740de0e5-c3e9-43bc-bb01-8c240f50070e\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.901587 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-combined-ca-bundle\") pod \"740de0e5-c3e9-43bc-bb01-8c240f50070e\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.901606 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-db-sync-config-data\") pod \"740de0e5-c3e9-43bc-bb01-8c240f50070e\" (UID: \"740de0e5-c3e9-43bc-bb01-8c240f50070e\") " Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.901592 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740de0e5-c3e9-43bc-bb01-8c240f50070e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "740de0e5-c3e9-43bc-bb01-8c240f50070e" (UID: "740de0e5-c3e9-43bc-bb01-8c240f50070e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.902447 4956 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/740de0e5-c3e9-43bc-bb01-8c240f50070e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.911640 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-scripts" (OuterVolumeSpecName: "scripts") pod "740de0e5-c3e9-43bc-bb01-8c240f50070e" (UID: "740de0e5-c3e9-43bc-bb01-8c240f50070e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.911725 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "740de0e5-c3e9-43bc-bb01-8c240f50070e" (UID: "740de0e5-c3e9-43bc-bb01-8c240f50070e"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.916786 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740de0e5-c3e9-43bc-bb01-8c240f50070e-kube-api-access-f8xzz" (OuterVolumeSpecName: "kube-api-access-f8xzz") pod "740de0e5-c3e9-43bc-bb01-8c240f50070e" (UID: "740de0e5-c3e9-43bc-bb01-8c240f50070e"). InnerVolumeSpecName "kube-api-access-f8xzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.935976 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "740de0e5-c3e9-43bc-bb01-8c240f50070e" (UID: "740de0e5-c3e9-43bc-bb01-8c240f50070e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:38 crc kubenswrapper[4956]: I0930 05:46:38.976009 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-config-data" (OuterVolumeSpecName: "config-data") pod "740de0e5-c3e9-43bc-bb01-8c240f50070e" (UID: "740de0e5-c3e9-43bc-bb01-8c240f50070e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.008712 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8xzz\" (UniqueName: \"kubernetes.io/projected/740de0e5-c3e9-43bc-bb01-8c240f50070e-kube-api-access-f8xzz\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.008750 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.008760 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.008771 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.008785 4956 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/740de0e5-c3e9-43bc-bb01-8c240f50070e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.238411 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zh2r7" podUID="31db92b4-e9cf-42e6-a531-9795e91837a2" containerName="registry-server" containerID="cri-o://de18a41f9c74fb2ad0dbd5cc40953a0d3985d1a84dc5b6082d66d79ba087f2d2" gracePeriod=2 Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.238858 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nnr6k" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.250312 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nnr6k" event={"ID":"740de0e5-c3e9-43bc-bb01-8c240f50070e","Type":"ContainerDied","Data":"a930588477252f5460e9259b0c01523b245f657a647303988300021b7118ec82"} Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.250351 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a930588477252f5460e9259b0c01523b245f657a647303988300021b7118ec82" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.465331 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 05:46:39 crc kubenswrapper[4956]: E0930 05:46:39.465949 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740de0e5-c3e9-43bc-bb01-8c240f50070e" containerName="cinder-db-sync" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.465962 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="740de0e5-c3e9-43bc-bb01-8c240f50070e" containerName="cinder-db-sync" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.475413 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="740de0e5-c3e9-43bc-bb01-8c240f50070e" containerName="cinder-db-sync" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.476556 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.488466 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9tmwg" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.494526 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.512496 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.519195 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.520528 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.620792 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.620891 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.620912 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.620953 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.621017 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-config-data\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.621054 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbskz\" (UniqueName: \"kubernetes.io/projected/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-kube-api-access-hbskz\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.667440 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75958fc765-6sqg8"] Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.669809 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.729481 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.729756 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.729798 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-scripts\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.729888 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.730057 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-6sqg8"] Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.730074 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.730259 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbskz\" (UniqueName: \"kubernetes.io/projected/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-kube-api-access-hbskz\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.730908 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.736315 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-scripts\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.738791 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.755885 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.778620 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-config-data\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.780552 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbskz\" (UniqueName: \"kubernetes.io/projected/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-kube-api-access-hbskz\") pod \"cinder-scheduler-0\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.797414 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.836210 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-config\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.836300 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-dns-svc\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.836331 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-ovsdbserver-sb\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" 
Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.836358 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzl7z\" (UniqueName: \"kubernetes.io/projected/058e635e-0fa8-429a-9a36-b52a9f219228-kube-api-access-nzl7z\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.836392 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-ovsdbserver-nb\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.836457 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-dns-swift-storage-0\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.938223 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-dns-swift-storage-0\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.938292 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-config\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " 
pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.938347 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-dns-svc\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.938369 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-ovsdbserver-sb\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.938389 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzl7z\" (UniqueName: \"kubernetes.io/projected/058e635e-0fa8-429a-9a36-b52a9f219228-kube-api-access-nzl7z\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.938422 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-ovsdbserver-nb\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.939942 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-dns-swift-storage-0\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 
05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.939946 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-ovsdbserver-sb\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.940522 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-config\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.941139 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-ovsdbserver-nb\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.941684 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-dns-svc\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:39 crc kubenswrapper[4956]: I0930 05:46:39.959880 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzl7z\" (UniqueName: \"kubernetes.io/projected/058e635e-0fa8-429a-9a36-b52a9f219228-kube-api-access-nzl7z\") pod \"dnsmasq-dns-75958fc765-6sqg8\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.014300 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.034291 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.038885 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.043307 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.055795 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-config-data-custom\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.055858 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhzxb\" (UniqueName: \"kubernetes.io/projected/270b46ca-2932-4939-9b32-4db415367e94-kube-api-access-vhzxb\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.055923 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.055982 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/270b46ca-2932-4939-9b32-4db415367e94-etc-machine-id\") pod \"cinder-api-0\" 
(UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.056021 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/270b46ca-2932-4939-9b32-4db415367e94-logs\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.056064 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-scripts\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.056202 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-config-data\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.070708 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.158270 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/270b46ca-2932-4939-9b32-4db415367e94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.158322 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/270b46ca-2932-4939-9b32-4db415367e94-logs\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " 
pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.158350 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-scripts\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.158373 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-config-data\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.158423 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/270b46ca-2932-4939-9b32-4db415367e94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.158514 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-config-data-custom\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.158559 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhzxb\" (UniqueName: \"kubernetes.io/projected/270b46ca-2932-4939-9b32-4db415367e94-kube-api-access-vhzxb\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.158648 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.158786 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/270b46ca-2932-4939-9b32-4db415367e94-logs\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.162312 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-config-data\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.164140 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.175547 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-config-data-custom\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.198760 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-scripts\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.206788 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vhzxb\" (UniqueName: \"kubernetes.io/projected/270b46ca-2932-4939-9b32-4db415367e94-kube-api-access-vhzxb\") pod \"cinder-api-0\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " pod="openstack/cinder-api-0" Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.263586 4956 generic.go:334] "Generic (PLEG): container finished" podID="31db92b4-e9cf-42e6-a531-9795e91837a2" containerID="de18a41f9c74fb2ad0dbd5cc40953a0d3985d1a84dc5b6082d66d79ba087f2d2" exitCode=0 Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.263621 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh2r7" event={"ID":"31db92b4-e9cf-42e6-a531-9795e91837a2","Type":"ContainerDied","Data":"de18a41f9c74fb2ad0dbd5cc40953a0d3985d1a84dc5b6082d66d79ba087f2d2"} Sep 30 05:46:40 crc kubenswrapper[4956]: I0930 05:46:40.366354 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 05:46:41 crc kubenswrapper[4956]: I0930 05:46:41.278033 4956 generic.go:334] "Generic (PLEG): container finished" podID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerID="8f8a6f9f7740abe06cc5bdd94291a38a3ad3a94cab9b7bf53c0e0f8ac5cf2df4" exitCode=0 Sep 30 05:46:41 crc kubenswrapper[4956]: I0930 05:46:41.278206 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9465cbc-c93e-4082-8ec8-fdd77d710210","Type":"ContainerDied","Data":"8f8a6f9f7740abe06cc5bdd94291a38a3ad3a94cab9b7bf53c0e0f8ac5cf2df4"} Sep 30 05:46:42 crc kubenswrapper[4956]: I0930 05:46:42.463430 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 05:46:42 crc kubenswrapper[4956]: I0930 05:46:42.794831 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-fc5c89796-dkbm8" Sep 30 05:46:43 crc kubenswrapper[4956]: I0930 05:46:43.342900 4956 scope.go:117] "RemoveContainer" 
containerID="4ef654140d80b3a5537f7ceebebd417fcff6cad58728141934dbf21caaecab9f" Sep 30 05:46:45 crc kubenswrapper[4956]: I0930 05:46:45.344931 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.176:3000/\": dial tcp 10.217.0.176:3000: connect: connection refused" Sep 30 05:46:45 crc kubenswrapper[4956]: I0930 05:46:45.798125 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:45 crc kubenswrapper[4956]: I0930 05:46:45.802788 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-779b5888b9-9hp77" Sep 30 05:46:45 crc kubenswrapper[4956]: E0930 05:46:45.978529 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de18a41f9c74fb2ad0dbd5cc40953a0d3985d1a84dc5b6082d66d79ba087f2d2 is running failed: container process not found" containerID="de18a41f9c74fb2ad0dbd5cc40953a0d3985d1a84dc5b6082d66d79ba087f2d2" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 05:46:45 crc kubenswrapper[4956]: E0930 05:46:45.979009 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de18a41f9c74fb2ad0dbd5cc40953a0d3985d1a84dc5b6082d66d79ba087f2d2 is running failed: container process not found" containerID="de18a41f9c74fb2ad0dbd5cc40953a0d3985d1a84dc5b6082d66d79ba087f2d2" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 05:46:45 crc kubenswrapper[4956]: E0930 05:46:45.979937 4956 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de18a41f9c74fb2ad0dbd5cc40953a0d3985d1a84dc5b6082d66d79ba087f2d2 is running failed: container 
process not found" containerID="de18a41f9c74fb2ad0dbd5cc40953a0d3985d1a84dc5b6082d66d79ba087f2d2" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 05:46:45 crc kubenswrapper[4956]: E0930 05:46:45.980014 4956 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de18a41f9c74fb2ad0dbd5cc40953a0d3985d1a84dc5b6082d66d79ba087f2d2 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-zh2r7" podUID="31db92b4-e9cf-42e6-a531-9795e91837a2" containerName="registry-server" Sep 30 05:46:46 crc kubenswrapper[4956]: I0930 05:46:46.057673 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-cbdcdc45c-b9667" Sep 30 05:46:46 crc kubenswrapper[4956]: I0930 05:46:46.122305 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fc5c89796-dkbm8"] Sep 30 05:46:46 crc kubenswrapper[4956]: I0930 05:46:46.122615 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-fc5c89796-dkbm8" podUID="ecc33e8d-e6f1-40aa-893c-b84853695537" containerName="neutron-api" containerID="cri-o://66dcc3d9abfa9b844f72e1c370dd183b80de9d10201c7a69b2ae21b199ef8cf1" gracePeriod=30 Sep 30 05:46:46 crc kubenswrapper[4956]: I0930 05:46:46.122649 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-fc5c89796-dkbm8" podUID="ecc33e8d-e6f1-40aa-893c-b84853695537" containerName="neutron-httpd" containerID="cri-o://8d5cff6e3b9c3fc6b37aa69fc684da082e2ba0692515596e61bff0abbc889b8f" gracePeriod=30 Sep 30 05:46:47 crc kubenswrapper[4956]: I0930 05:46:47.233465 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h5ch9" podUID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" containerName="registry-server" probeResult="failure" output=< Sep 30 05:46:47 crc kubenswrapper[4956]: timeout: failed to connect service 
":50051" within 1s Sep 30 05:46:47 crc kubenswrapper[4956]: > Sep 30 05:46:47 crc kubenswrapper[4956]: I0930 05:46:47.351526 4956 generic.go:334] "Generic (PLEG): container finished" podID="ecc33e8d-e6f1-40aa-893c-b84853695537" containerID="8d5cff6e3b9c3fc6b37aa69fc684da082e2ba0692515596e61bff0abbc889b8f" exitCode=0 Sep 30 05:46:47 crc kubenswrapper[4956]: I0930 05:46:47.351567 4956 generic.go:334] "Generic (PLEG): container finished" podID="ecc33e8d-e6f1-40aa-893c-b84853695537" containerID="66dcc3d9abfa9b844f72e1c370dd183b80de9d10201c7a69b2ae21b199ef8cf1" exitCode=0 Sep 30 05:46:47 crc kubenswrapper[4956]: I0930 05:46:47.351591 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc5c89796-dkbm8" event={"ID":"ecc33e8d-e6f1-40aa-893c-b84853695537","Type":"ContainerDied","Data":"8d5cff6e3b9c3fc6b37aa69fc684da082e2ba0692515596e61bff0abbc889b8f"} Sep 30 05:46:47 crc kubenswrapper[4956]: I0930 05:46:47.351621 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc5c89796-dkbm8" event={"ID":"ecc33e8d-e6f1-40aa-893c-b84853695537","Type":"ContainerDied","Data":"66dcc3d9abfa9b844f72e1c370dd183b80de9d10201c7a69b2ae21b199ef8cf1"} Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.079489 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.079544 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.497838 
4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.640042 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-config-data\") pod \"e9465cbc-c93e-4082-8ec8-fdd77d710210\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.640206 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-sg-core-conf-yaml\") pod \"e9465cbc-c93e-4082-8ec8-fdd77d710210\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.640256 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-combined-ca-bundle\") pod \"e9465cbc-c93e-4082-8ec8-fdd77d710210\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.640317 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-scripts\") pod \"e9465cbc-c93e-4082-8ec8-fdd77d710210\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.640358 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4dzk\" (UniqueName: \"kubernetes.io/projected/e9465cbc-c93e-4082-8ec8-fdd77d710210-kube-api-access-z4dzk\") pod \"e9465cbc-c93e-4082-8ec8-fdd77d710210\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.640453 4956 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9465cbc-c93e-4082-8ec8-fdd77d710210-run-httpd\") pod \"e9465cbc-c93e-4082-8ec8-fdd77d710210\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.640482 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9465cbc-c93e-4082-8ec8-fdd77d710210-log-httpd\") pod \"e9465cbc-c93e-4082-8ec8-fdd77d710210\" (UID: \"e9465cbc-c93e-4082-8ec8-fdd77d710210\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.642059 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9465cbc-c93e-4082-8ec8-fdd77d710210-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e9465cbc-c93e-4082-8ec8-fdd77d710210" (UID: "e9465cbc-c93e-4082-8ec8-fdd77d710210"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.642265 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9465cbc-c93e-4082-8ec8-fdd77d710210-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e9465cbc-c93e-4082-8ec8-fdd77d710210" (UID: "e9465cbc-c93e-4082-8ec8-fdd77d710210"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.646603 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9465cbc-c93e-4082-8ec8-fdd77d710210-kube-api-access-z4dzk" (OuterVolumeSpecName: "kube-api-access-z4dzk") pod "e9465cbc-c93e-4082-8ec8-fdd77d710210" (UID: "e9465cbc-c93e-4082-8ec8-fdd77d710210"). InnerVolumeSpecName "kube-api-access-z4dzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.647564 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-scripts" (OuterVolumeSpecName: "scripts") pod "e9465cbc-c93e-4082-8ec8-fdd77d710210" (UID: "e9465cbc-c93e-4082-8ec8-fdd77d710210"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.680270 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e9465cbc-c93e-4082-8ec8-fdd77d710210" (UID: "e9465cbc-c93e-4082-8ec8-fdd77d710210"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.717665 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc5c89796-dkbm8" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.733738 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.744240 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.744277 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4dzk\" (UniqueName: \"kubernetes.io/projected/e9465cbc-c93e-4082-8ec8-fdd77d710210-kube-api-access-z4dzk\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.744291 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9465cbc-c93e-4082-8ec8-fdd77d710210-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.744302 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9465cbc-c93e-4082-8ec8-fdd77d710210-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.744313 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.751319 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9465cbc-c93e-4082-8ec8-fdd77d710210" (UID: "e9465cbc-c93e-4082-8ec8-fdd77d710210"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.777004 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-config-data" (OuterVolumeSpecName: "config-data") pod "e9465cbc-c93e-4082-8ec8-fdd77d710210" (UID: "e9465cbc-c93e-4082-8ec8-fdd77d710210"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.845743 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsvqj\" (UniqueName: \"kubernetes.io/projected/31db92b4-e9cf-42e6-a531-9795e91837a2-kube-api-access-tsvqj\") pod \"31db92b4-e9cf-42e6-a531-9795e91837a2\" (UID: \"31db92b4-e9cf-42e6-a531-9795e91837a2\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.845815 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31db92b4-e9cf-42e6-a531-9795e91837a2-utilities\") pod \"31db92b4-e9cf-42e6-a531-9795e91837a2\" (UID: \"31db92b4-e9cf-42e6-a531-9795e91837a2\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.845870 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-httpd-config\") pod \"ecc33e8d-e6f1-40aa-893c-b84853695537\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.845924 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-ovndb-tls-certs\") pod \"ecc33e8d-e6f1-40aa-893c-b84853695537\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.845963 4956 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nlwqp\" (UniqueName: \"kubernetes.io/projected/ecc33e8d-e6f1-40aa-893c-b84853695537-kube-api-access-nlwqp\") pod \"ecc33e8d-e6f1-40aa-893c-b84853695537\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.846084 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-combined-ca-bundle\") pod \"ecc33e8d-e6f1-40aa-893c-b84853695537\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.846280 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31db92b4-e9cf-42e6-a531-9795e91837a2-catalog-content\") pod \"31db92b4-e9cf-42e6-a531-9795e91837a2\" (UID: \"31db92b4-e9cf-42e6-a531-9795e91837a2\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.846969 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-config\") pod \"ecc33e8d-e6f1-40aa-893c-b84853695537\" (UID: \"ecc33e8d-e6f1-40aa-893c-b84853695537\") " Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.846501 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31db92b4-e9cf-42e6-a531-9795e91837a2-utilities" (OuterVolumeSpecName: "utilities") pod "31db92b4-e9cf-42e6-a531-9795e91837a2" (UID: "31db92b4-e9cf-42e6-a531-9795e91837a2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.848098 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.848133 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9465cbc-c93e-4082-8ec8-fdd77d710210-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.848147 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31db92b4-e9cf-42e6-a531-9795e91837a2-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.849999 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ecc33e8d-e6f1-40aa-893c-b84853695537" (UID: "ecc33e8d-e6f1-40aa-893c-b84853695537"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.880505 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc33e8d-e6f1-40aa-893c-b84853695537-kube-api-access-nlwqp" (OuterVolumeSpecName: "kube-api-access-nlwqp") pod "ecc33e8d-e6f1-40aa-893c-b84853695537" (UID: "ecc33e8d-e6f1-40aa-893c-b84853695537"). InnerVolumeSpecName "kube-api-access-nlwqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.880577 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31db92b4-e9cf-42e6-a531-9795e91837a2-kube-api-access-tsvqj" (OuterVolumeSpecName: "kube-api-access-tsvqj") pod "31db92b4-e9cf-42e6-a531-9795e91837a2" (UID: "31db92b4-e9cf-42e6-a531-9795e91837a2"). InnerVolumeSpecName "kube-api-access-tsvqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.881781 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31db92b4-e9cf-42e6-a531-9795e91837a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31db92b4-e9cf-42e6-a531-9795e91837a2" (UID: "31db92b4-e9cf-42e6-a531-9795e91837a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: W0930 05:46:48.887783 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod270b46ca_2932_4939_9b32_4db415367e94.slice/crio-25b86b5868cdb3709a07f1a6afc885db2d734c816b7d0e58b7b9eb01edd09e89 WatchSource:0}: Error finding container 25b86b5868cdb3709a07f1a6afc885db2d734c816b7d0e58b7b9eb01edd09e89: Status 404 returned error can't find the container with id 25b86b5868cdb3709a07f1a6afc885db2d734c816b7d0e58b7b9eb01edd09e89 Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.892383 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.911172 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.912650 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-config" (OuterVolumeSpecName: "config") pod "ecc33e8d-e6f1-40aa-893c-b84853695537" (UID: "ecc33e8d-e6f1-40aa-893c-b84853695537"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.936029 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecc33e8d-e6f1-40aa-893c-b84853695537" (UID: "ecc33e8d-e6f1-40aa-893c-b84853695537"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.951502 4956 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.951591 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlwqp\" (UniqueName: \"kubernetes.io/projected/ecc33e8d-e6f1-40aa-893c-b84853695537-kube-api-access-nlwqp\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.951674 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.951688 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31db92b4-e9cf-42e6-a531-9795e91837a2-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.951700 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.951711 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsvqj\" (UniqueName: \"kubernetes.io/projected/31db92b4-e9cf-42e6-a531-9795e91837a2-kube-api-access-tsvqj\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:48 crc kubenswrapper[4956]: I0930 05:46:48.981221 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ecc33e8d-e6f1-40aa-893c-b84853695537" (UID: "ecc33e8d-e6f1-40aa-893c-b84853695537"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.054675 4956 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc33e8d-e6f1-40aa-893c-b84853695537-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.078527 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-6sqg8"] Sep 30 05:46:49 crc kubenswrapper[4956]: W0930 05:46:49.089669 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod058e635e_0fa8_429a_9a36_b52a9f219228.slice/crio-b681bca102512e48d5482d73101083864a0d9f7da333489b0d824268a37ef3d6 WatchSource:0}: Error finding container b681bca102512e48d5482d73101083864a0d9f7da333489b0d824268a37ef3d6: Status 404 returned error can't find the container with id b681bca102512e48d5482d73101083864a0d9f7da333489b0d824268a37ef3d6 Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.393179 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc5c89796-dkbm8" 
event={"ID":"ecc33e8d-e6f1-40aa-893c-b84853695537","Type":"ContainerDied","Data":"e0699261d38288b4952fcf1e23908c74257c44aab61b25cd673db74cd624f039"} Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.393437 4956 scope.go:117] "RemoveContainer" containerID="8d5cff6e3b9c3fc6b37aa69fc684da082e2ba0692515596e61bff0abbc889b8f" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.393224 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc5c89796-dkbm8" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.396876 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aad83b1a-8ea2-4f43-a6e3-b8e844a65115","Type":"ContainerStarted","Data":"b7bb5fcf1a41cca0b0c54d61a951edf4eee6ba1cdeb409a13a82947abc1fdaa3"} Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.402106 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cd011aa6-7a4c-44b4-8b90-c1e07e06f779","Type":"ContainerStarted","Data":"41ee0311c70da3cdbe14c259fb3586f8ec7bf0a790d93a31b99a663e55d0faa1"} Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.406700 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh2r7" event={"ID":"31db92b4-e9cf-42e6-a531-9795e91837a2","Type":"ContainerDied","Data":"98226c2a35e4cf0a3fc0fb3e71fa57da6eae6e56f6c9209e00a7e9c76437f2cf"} Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.406746 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zh2r7" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.418523 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9465cbc-c93e-4082-8ec8-fdd77d710210","Type":"ContainerDied","Data":"7c86f0ef629175f0a102d4f3095b041c0faf6127f45335d08a416a60cc9647ab"} Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.418658 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.422793 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f596b95b-7b2b-4d7b-8f33-9eb214a39a21","Type":"ContainerStarted","Data":"ecfca67d788f77d2ae16ceb6645ef945c19fd6f9d5fb34f3ec90e595589b5914"} Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.428613 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"270b46ca-2932-4939-9b32-4db415367e94","Type":"ContainerStarted","Data":"25b86b5868cdb3709a07f1a6afc885db2d734c816b7d0e58b7b9eb01edd09e89"} Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.438017 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" event={"ID":"058e635e-0fa8-429a-9a36-b52a9f219228","Type":"ContainerStarted","Data":"b681bca102512e48d5482d73101083864a0d9f7da333489b0d824268a37ef3d6"} Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.448339 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fc5c89796-dkbm8"] Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.464093 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fc5c89796-dkbm8"] Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.471270 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.768877232 
podStartE2EDuration="19.471255044s" podCreationTimestamp="2025-09-30 05:46:30 +0000 UTC" firstStartedPulling="2025-09-30 05:46:31.490065017 +0000 UTC m=+1061.817185542" lastFinishedPulling="2025-09-30 05:46:48.192442829 +0000 UTC m=+1078.519563354" observedRunningTime="2025-09-30 05:46:49.458136841 +0000 UTC m=+1079.785257386" watchObservedRunningTime="2025-09-30 05:46:49.471255044 +0000 UTC m=+1079.798375569" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.494615 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.498455 4956 scope.go:117] "RemoveContainer" containerID="66dcc3d9abfa9b844f72e1c370dd183b80de9d10201c7a69b2ae21b199ef8cf1" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.527042 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.542820 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh2r7"] Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.559029 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh2r7"] Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.574296 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:49 crc kubenswrapper[4956]: E0930 05:46:49.575553 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="sg-core" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.575575 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="sg-core" Sep 30 05:46:49 crc kubenswrapper[4956]: E0930 05:46:49.575620 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc33e8d-e6f1-40aa-893c-b84853695537" containerName="neutron-httpd" Sep 30 05:46:49 crc 
kubenswrapper[4956]: I0930 05:46:49.575628 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc33e8d-e6f1-40aa-893c-b84853695537" containerName="neutron-httpd" Sep 30 05:46:49 crc kubenswrapper[4956]: E0930 05:46:49.575644 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="proxy-httpd" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.575651 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="proxy-httpd" Sep 30 05:46:49 crc kubenswrapper[4956]: E0930 05:46:49.575663 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="ceilometer-central-agent" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.575670 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="ceilometer-central-agent" Sep 30 05:46:49 crc kubenswrapper[4956]: E0930 05:46:49.575686 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31db92b4-e9cf-42e6-a531-9795e91837a2" containerName="registry-server" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.575693 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="31db92b4-e9cf-42e6-a531-9795e91837a2" containerName="registry-server" Sep 30 05:46:49 crc kubenswrapper[4956]: E0930 05:46:49.575701 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc33e8d-e6f1-40aa-893c-b84853695537" containerName="neutron-api" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.575706 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc33e8d-e6f1-40aa-893c-b84853695537" containerName="neutron-api" Sep 30 05:46:49 crc kubenswrapper[4956]: E0930 05:46:49.575721 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31db92b4-e9cf-42e6-a531-9795e91837a2" containerName="extract-utilities" Sep 30 05:46:49 crc 
kubenswrapper[4956]: I0930 05:46:49.575727 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="31db92b4-e9cf-42e6-a531-9795e91837a2" containerName="extract-utilities" Sep 30 05:46:49 crc kubenswrapper[4956]: E0930 05:46:49.575742 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="ceilometer-notification-agent" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.575748 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="ceilometer-notification-agent" Sep 30 05:46:49 crc kubenswrapper[4956]: E0930 05:46:49.575754 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31db92b4-e9cf-42e6-a531-9795e91837a2" containerName="extract-content" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.575760 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="31db92b4-e9cf-42e6-a531-9795e91837a2" containerName="extract-content" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.575944 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="31db92b4-e9cf-42e6-a531-9795e91837a2" containerName="registry-server" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.575962 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="ceilometer-notification-agent" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.575973 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc33e8d-e6f1-40aa-893c-b84853695537" containerName="neutron-httpd" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.575985 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="sg-core" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.575994 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" 
containerName="proxy-httpd" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.576002 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" containerName="ceilometer-central-agent" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.576012 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc33e8d-e6f1-40aa-893c-b84853695537" containerName="neutron-api" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.582092 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.582210 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.584712 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.584903 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.590517 4956 scope.go:117] "RemoveContainer" containerID="de18a41f9c74fb2ad0dbd5cc40953a0d3985d1a84dc5b6082d66d79ba087f2d2" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.625280 4956 scope.go:117] "RemoveContainer" containerID="1659362fde65409e4e06a6a1a58955816ae7c5b328d89e16568fd871f11ce1f9" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.662883 4956 scope.go:117] "RemoveContainer" containerID="a8f3689275a8d51bfa2cf96150bad7b79f1256df8da43b7202a297e45c334057" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.685866 4956 scope.go:117] "RemoveContainer" containerID="2e30f70aea206ad0d70110b426528eee91efaa80099cb2625693b0a82f9baf28" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.721021 4956 scope.go:117] "RemoveContainer" containerID="77bd564eb7f6df95059f7fccc0ed74f9f114bc49b66253c257593c3cc88ec341" Sep 30 05:46:49 crc 
kubenswrapper[4956]: I0930 05:46:49.763607 4956 scope.go:117] "RemoveContainer" containerID="8f8a6f9f7740abe06cc5bdd94291a38a3ad3a94cab9b7bf53c0e0f8ac5cf2df4" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.775315 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-scripts\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.775430 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47db223c-b39e-48be-adbe-d50da747580a-run-httpd\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.775469 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.775517 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.775546 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-config-data\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " 
pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.775620 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47db223c-b39e-48be-adbe-d50da747580a-log-httpd\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.775661 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgwvs\" (UniqueName: \"kubernetes.io/projected/47db223c-b39e-48be-adbe-d50da747580a-kube-api-access-rgwvs\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.800090 4956 scope.go:117] "RemoveContainer" containerID="fe604eb6614a3fd4df304f09dc0b0457a81958897fc0590d2f97f13bafe71ca4" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.877879 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-scripts\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.877964 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47db223c-b39e-48be-adbe-d50da747580a-run-httpd\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.877993 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " 
pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.878022 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.878038 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-config-data\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.878091 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47db223c-b39e-48be-adbe-d50da747580a-log-httpd\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.878706 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47db223c-b39e-48be-adbe-d50da747580a-run-httpd\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.879043 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47db223c-b39e-48be-adbe-d50da747580a-log-httpd\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.879285 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgwvs\" (UniqueName: 
\"kubernetes.io/projected/47db223c-b39e-48be-adbe-d50da747580a-kube-api-access-rgwvs\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.885025 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.885676 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-config-data\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.886149 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.887566 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-scripts\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.894416 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgwvs\" (UniqueName: \"kubernetes.io/projected/47db223c-b39e-48be-adbe-d50da747580a-kube-api-access-rgwvs\") pod \"ceilometer-0\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " pod="openstack/ceilometer-0" Sep 30 05:46:49 crc kubenswrapper[4956]: I0930 05:46:49.942456 4956 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:46:50 crc kubenswrapper[4956]: I0930 05:46:50.358276 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31db92b4-e9cf-42e6-a531-9795e91837a2" path="/var/lib/kubelet/pods/31db92b4-e9cf-42e6-a531-9795e91837a2/volumes" Sep 30 05:46:50 crc kubenswrapper[4956]: I0930 05:46:50.365376 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9465cbc-c93e-4082-8ec8-fdd77d710210" path="/var/lib/kubelet/pods/e9465cbc-c93e-4082-8ec8-fdd77d710210/volumes" Sep 30 05:46:50 crc kubenswrapper[4956]: I0930 05:46:50.369767 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc33e8d-e6f1-40aa-893c-b84853695537" path="/var/lib/kubelet/pods/ecc33e8d-e6f1-40aa-893c-b84853695537/volumes" Sep 30 05:46:50 crc kubenswrapper[4956]: I0930 05:46:50.450910 4956 generic.go:334] "Generic (PLEG): container finished" podID="058e635e-0fa8-429a-9a36-b52a9f219228" containerID="d3479e8fa45376bbc823172d6c94c1fa5194109c18424b5588c1a682a89478a3" exitCode=0 Sep 30 05:46:50 crc kubenswrapper[4956]: I0930 05:46:50.451020 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" event={"ID":"058e635e-0fa8-429a-9a36-b52a9f219228","Type":"ContainerDied","Data":"d3479e8fa45376bbc823172d6c94c1fa5194109c18424b5588c1a682a89478a3"} Sep 30 05:46:50 crc kubenswrapper[4956]: I0930 05:46:50.457770 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cd011aa6-7a4c-44b4-8b90-c1e07e06f779","Type":"ContainerStarted","Data":"978fae24e59f3d8ac596b4f0a4f7fa1b511b92e045f3136efaabf8b5cda33852"} Sep 30 05:46:50 crc kubenswrapper[4956]: I0930 05:46:50.463098 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"270b46ca-2932-4939-9b32-4db415367e94","Type":"ContainerStarted","Data":"4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f"} Sep 30 05:46:50 crc kubenswrapper[4956]: I0930 05:46:50.533768 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 05:46:50 crc kubenswrapper[4956]: I0930 05:46:50.533807 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 05:46:50 crc kubenswrapper[4956]: I0930 05:46:50.562031 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:50 crc kubenswrapper[4956]: W0930 05:46:50.606060 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47db223c_b39e_48be_adbe_d50da747580a.slice/crio-6d40c0b3c14ec84aba5e1290c081a648fd6e8ab4fd50aab203a7363a206bfe95 WatchSource:0}: Error finding container 6d40c0b3c14ec84aba5e1290c081a648fd6e8ab4fd50aab203a7363a206bfe95: Status 404 returned error can't find the container with id 6d40c0b3c14ec84aba5e1290c081a648fd6e8ab4fd50aab203a7363a206bfe95 Sep 30 05:46:50 crc kubenswrapper[4956]: I0930 05:46:50.626610 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.482469 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" event={"ID":"058e635e-0fa8-429a-9a36-b52a9f219228","Type":"ContainerStarted","Data":"70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94"} Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.482789 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.484148 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"47db223c-b39e-48be-adbe-d50da747580a","Type":"ContainerStarted","Data":"f1f6bae6d9337fae2dc8e2358a784aeae30bcadafb8b882179d4e55da638473f"} Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.484206 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47db223c-b39e-48be-adbe-d50da747580a","Type":"ContainerStarted","Data":"6d40c0b3c14ec84aba5e1290c081a648fd6e8ab4fd50aab203a7363a206bfe95"} Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.487991 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cd011aa6-7a4c-44b4-8b90-c1e07e06f779","Type":"ContainerStarted","Data":"1dba6ae6b9516d9532b383ef5fd837429a2882087fa8f797c670af9b211b4362"} Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.490261 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="270b46ca-2932-4939-9b32-4db415367e94" containerName="cinder-api-log" containerID="cri-o://4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f" gracePeriod=30 Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.490589 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"270b46ca-2932-4939-9b32-4db415367e94","Type":"ContainerStarted","Data":"ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec"} Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.490643 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.490674 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="270b46ca-2932-4939-9b32-4db415367e94" containerName="cinder-api" containerID="cri-o://ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec" gracePeriod=30 Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.508149 
4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" podStartSLOduration=12.508130861 podStartE2EDuration="12.508130861s" podCreationTimestamp="2025-09-30 05:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:46:51.501679219 +0000 UTC m=+1081.828799744" watchObservedRunningTime="2025-09-30 05:46:51.508130861 +0000 UTC m=+1081.835251406" Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.529107 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=11.52908538 podStartE2EDuration="11.52908538s" podCreationTimestamp="2025-09-30 05:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:46:51.520996886 +0000 UTC m=+1081.848117421" watchObservedRunningTime="2025-09-30 05:46:51.52908538 +0000 UTC m=+1081.856205905" Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.541092 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Sep 30 05:46:51 crc kubenswrapper[4956]: I0930 05:46:51.568597 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=12.17032198 podStartE2EDuration="12.568578132s" podCreationTimestamp="2025-09-30 05:46:39 +0000 UTC" firstStartedPulling="2025-09-30 05:46:48.924419641 +0000 UTC m=+1079.251540166" lastFinishedPulling="2025-09-30 05:46:49.322675793 +0000 UTC m=+1079.649796318" observedRunningTime="2025-09-30 05:46:51.549470961 +0000 UTC m=+1081.876591486" watchObservedRunningTime="2025-09-30 05:46:51.568578132 +0000 UTC m=+1081.895698657" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.348534 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.429428 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-combined-ca-bundle\") pod \"270b46ca-2932-4939-9b32-4db415367e94\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.429505 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/270b46ca-2932-4939-9b32-4db415367e94-logs\") pod \"270b46ca-2932-4939-9b32-4db415367e94\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.429549 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-config-data-custom\") pod \"270b46ca-2932-4939-9b32-4db415367e94\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.429575 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhzxb\" (UniqueName: \"kubernetes.io/projected/270b46ca-2932-4939-9b32-4db415367e94-kube-api-access-vhzxb\") pod \"270b46ca-2932-4939-9b32-4db415367e94\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.429601 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-config-data\") pod \"270b46ca-2932-4939-9b32-4db415367e94\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.429721 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/270b46ca-2932-4939-9b32-4db415367e94-etc-machine-id\") pod \"270b46ca-2932-4939-9b32-4db415367e94\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.429845 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-scripts\") pod \"270b46ca-2932-4939-9b32-4db415367e94\" (UID: \"270b46ca-2932-4939-9b32-4db415367e94\") " Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.429971 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/270b46ca-2932-4939-9b32-4db415367e94-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "270b46ca-2932-4939-9b32-4db415367e94" (UID: "270b46ca-2932-4939-9b32-4db415367e94"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.429990 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/270b46ca-2932-4939-9b32-4db415367e94-logs" (OuterVolumeSpecName: "logs") pod "270b46ca-2932-4939-9b32-4db415367e94" (UID: "270b46ca-2932-4939-9b32-4db415367e94"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.430476 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/270b46ca-2932-4939-9b32-4db415367e94-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.430501 4956 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/270b46ca-2932-4939-9b32-4db415367e94-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.444297 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/270b46ca-2932-4939-9b32-4db415367e94-kube-api-access-vhzxb" (OuterVolumeSpecName: "kube-api-access-vhzxb") pod "270b46ca-2932-4939-9b32-4db415367e94" (UID: "270b46ca-2932-4939-9b32-4db415367e94"). InnerVolumeSpecName "kube-api-access-vhzxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.447108 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-scripts" (OuterVolumeSpecName: "scripts") pod "270b46ca-2932-4939-9b32-4db415367e94" (UID: "270b46ca-2932-4939-9b32-4db415367e94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.466297 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "270b46ca-2932-4939-9b32-4db415367e94" (UID: "270b46ca-2932-4939-9b32-4db415367e94"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.491648 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "270b46ca-2932-4939-9b32-4db415367e94" (UID: "270b46ca-2932-4939-9b32-4db415367e94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.499603 4956 generic.go:334] "Generic (PLEG): container finished" podID="270b46ca-2932-4939-9b32-4db415367e94" containerID="ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec" exitCode=0 Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.499642 4956 generic.go:334] "Generic (PLEG): container finished" podID="270b46ca-2932-4939-9b32-4db415367e94" containerID="4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f" exitCode=143 Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.499681 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"270b46ca-2932-4939-9b32-4db415367e94","Type":"ContainerDied","Data":"ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec"} Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.499709 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"270b46ca-2932-4939-9b32-4db415367e94","Type":"ContainerDied","Data":"4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f"} Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.499718 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"270b46ca-2932-4939-9b32-4db415367e94","Type":"ContainerDied","Data":"25b86b5868cdb3709a07f1a6afc885db2d734c816b7d0e58b7b9eb01edd09e89"} Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.499733 4956 scope.go:117] 
"RemoveContainer" containerID="ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.499849 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.503192 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47db223c-b39e-48be-adbe-d50da747580a","Type":"ContainerStarted","Data":"ffd6a03764ccdec20a44f2b1a83894845d8faf098dfb15ac8bec62346120fe0c"} Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.531832 4956 scope.go:117] "RemoveContainer" containerID="4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.533014 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.533040 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.533049 4956 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.533059 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhzxb\" (UniqueName: \"kubernetes.io/projected/270b46ca-2932-4939-9b32-4db415367e94-kube-api-access-vhzxb\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.557216 4956 scope.go:117] "RemoveContainer" 
containerID="ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec" Sep 30 05:46:52 crc kubenswrapper[4956]: E0930 05:46:52.557923 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec\": container with ID starting with ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec not found: ID does not exist" containerID="ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.557947 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec"} err="failed to get container status \"ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec\": rpc error: code = NotFound desc = could not find container \"ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec\": container with ID starting with ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec not found: ID does not exist" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.557966 4956 scope.go:117] "RemoveContainer" containerID="4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f" Sep 30 05:46:52 crc kubenswrapper[4956]: E0930 05:46:52.558146 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f\": container with ID starting with 4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f not found: ID does not exist" containerID="4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.558167 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f"} err="failed to get container status \"4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f\": rpc error: code = NotFound desc = could not find container \"4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f\": container with ID starting with 4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f not found: ID does not exist" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.558179 4956 scope.go:117] "RemoveContainer" containerID="ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.558332 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec"} err="failed to get container status \"ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec\": rpc error: code = NotFound desc = could not find container \"ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec\": container with ID starting with ce389776a7c73b453fdf6107d396b5d44e68386c1d7ffc92131a817d717049ec not found: ID does not exist" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.558352 4956 scope.go:117] "RemoveContainer" containerID="4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.558654 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f"} err="failed to get container status \"4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f\": rpc error: code = NotFound desc = could not find container \"4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f\": container with ID starting with 4d2706ce4b529a30041cb3e27ff9e03a616b1b7abe9abb73b59ab92bacc33e0f not found: ID does not 
exist" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.559201 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-config-data" (OuterVolumeSpecName: "config-data") pod "270b46ca-2932-4939-9b32-4db415367e94" (UID: "270b46ca-2932-4939-9b32-4db415367e94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.635046 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/270b46ca-2932-4939-9b32-4db415367e94-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.840973 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.856792 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.873656 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 05:46:52 crc kubenswrapper[4956]: E0930 05:46:52.874344 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270b46ca-2932-4939-9b32-4db415367e94" containerName="cinder-api" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.874358 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="270b46ca-2932-4939-9b32-4db415367e94" containerName="cinder-api" Sep 30 05:46:52 crc kubenswrapper[4956]: E0930 05:46:52.874374 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270b46ca-2932-4939-9b32-4db415367e94" containerName="cinder-api-log" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.874379 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="270b46ca-2932-4939-9b32-4db415367e94" containerName="cinder-api-log" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.874552 4956 
memory_manager.go:354] "RemoveStaleState removing state" podUID="270b46ca-2932-4939-9b32-4db415367e94" containerName="cinder-api" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.874577 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="270b46ca-2932-4939-9b32-4db415367e94" containerName="cinder-api-log" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.875533 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.878655 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.879032 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.882874 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 30 05:46:52 crc kubenswrapper[4956]: I0930 05:46:52.904368 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.048395 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x9q9\" (UniqueName: \"kubernetes.io/projected/3b0e586a-4f48-4d87-9ecb-732f5723e089-kube-api-access-9x9q9\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.048463 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.048520 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b0e586a-4f48-4d87-9ecb-732f5723e089-logs\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.048612 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.048641 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-scripts\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.048679 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.048806 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b0e586a-4f48-4d87-9ecb-732f5723e089-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.048869 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-config-data\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.048897 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.150739 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.150807 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b0e586a-4f48-4d87-9ecb-732f5723e089-logs\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.150865 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.150889 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-scripts\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: 
I0930 05:46:53.150913 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.150936 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b0e586a-4f48-4d87-9ecb-732f5723e089-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.150953 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-config-data\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.150968 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.151035 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x9q9\" (UniqueName: \"kubernetes.io/projected/3b0e586a-4f48-4d87-9ecb-732f5723e089-kube-api-access-9x9q9\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.153047 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/3b0e586a-4f48-4d87-9ecb-732f5723e089-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.154049 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b0e586a-4f48-4d87-9ecb-732f5723e089-logs\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.155596 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.156331 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.156939 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-scripts\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.159248 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.162123 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-config-data\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.167844 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b0e586a-4f48-4d87-9ecb-732f5723e089-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.176771 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x9q9\" (UniqueName: \"kubernetes.io/projected/3b0e586a-4f48-4d87-9ecb-732f5723e089-kube-api-access-9x9q9\") pod \"cinder-api-0\" (UID: \"3b0e586a-4f48-4d87-9ecb-732f5723e089\") " pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.204587 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.691342 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47db223c-b39e-48be-adbe-d50da747580a","Type":"ContainerStarted","Data":"96aef16228e08aee0b033086705676cc2afd3dd394bb8187b322cf0ce56a077b"} Sep 30 05:46:53 crc kubenswrapper[4956]: I0930 05:46:53.726233 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 05:46:54 crc kubenswrapper[4956]: I0930 05:46:54.351208 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="270b46ca-2932-4939-9b32-4db415367e94" path="/var/lib/kubelet/pods/270b46ca-2932-4939-9b32-4db415367e94/volumes" Sep 30 05:46:54 crc kubenswrapper[4956]: I0930 05:46:54.707166 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b0e586a-4f48-4d87-9ecb-732f5723e089","Type":"ContainerStarted","Data":"c9424c904599784e3d90bd6ef15f03f92ed73c4341a221b2446bbc7d7ae1512c"} Sep 30 05:46:54 crc kubenswrapper[4956]: I0930 05:46:54.707543 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b0e586a-4f48-4d87-9ecb-732f5723e089","Type":"ContainerStarted","Data":"e4bdd5f52a08c7c33a36c34f63d7c215afa5dbf5a1bb62edd6ef416f7830127f"} Sep 30 05:46:54 crc kubenswrapper[4956]: I0930 05:46:54.713960 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47db223c-b39e-48be-adbe-d50da747580a","Type":"ContainerStarted","Data":"ba77419992750666eff2d90eb215a47f9f74e46273c1571d9d4bcc648f0a0029"} Sep 30 05:46:54 crc kubenswrapper[4956]: I0930 05:46:54.717207 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 05:46:54 crc kubenswrapper[4956]: I0930 05:46:54.748946 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.510426524 podStartE2EDuration="5.74892751s" podCreationTimestamp="2025-09-30 05:46:49 +0000 UTC" firstStartedPulling="2025-09-30 05:46:50.61491279 +0000 UTC m=+1080.942033315" lastFinishedPulling="2025-09-30 05:46:53.853413776 +0000 UTC m=+1084.180534301" observedRunningTime="2025-09-30 05:46:54.736024974 +0000 UTC m=+1085.063145509" watchObservedRunningTime="2025-09-30 05:46:54.74892751 +0000 UTC m=+1085.076048035" Sep 30 05:46:54 crc kubenswrapper[4956]: I0930 05:46:54.798448 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.017374 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.073737 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c68846bf-fppc2"] Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.074379 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84c68846bf-fppc2" podUID="87cb9729-9cb1-4377-834e-b6e5c11e39a3" containerName="dnsmasq-dns" containerID="cri-o://07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270" gracePeriod=10 Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.204966 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.693869 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c68846bf-fppc2" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.726703 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b0e586a-4f48-4d87-9ecb-732f5723e089","Type":"ContainerStarted","Data":"a54253259f34a3ba471f2d1eb752f06c0cc1ba118590085bf5aa0bd8aa932def"} Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.726795 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.728650 4956 generic.go:334] "Generic (PLEG): container finished" podID="87cb9729-9cb1-4377-834e-b6e5c11e39a3" containerID="07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270" exitCode=0 Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.729268 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c68846bf-fppc2" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.729423 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c68846bf-fppc2" event={"ID":"87cb9729-9cb1-4377-834e-b6e5c11e39a3","Type":"ContainerDied","Data":"07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270"} Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.729449 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c68846bf-fppc2" event={"ID":"87cb9729-9cb1-4377-834e-b6e5c11e39a3","Type":"ContainerDied","Data":"5d0712f4e33630600f07994ecf835c08f67614486b6042d5077d6b31f0ad9039"} Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.729467 4956 scope.go:117] "RemoveContainer" containerID="07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.756302 4956 scope.go:117] "RemoveContainer" containerID="627c5233fd1951147a9f45a24876d048b39fffc8d598ba290be1bfec539e15b1" Sep 30 05:46:55 crc kubenswrapper[4956]: 
I0930 05:46:55.763015 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.762998482 podStartE2EDuration="3.762998482s" podCreationTimestamp="2025-09-30 05:46:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:46:55.756736044 +0000 UTC m=+1086.083856559" watchObservedRunningTime="2025-09-30 05:46:55.762998482 +0000 UTC m=+1086.090119007" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.789364 4956 scope.go:117] "RemoveContainer" containerID="07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270" Sep 30 05:46:55 crc kubenswrapper[4956]: E0930 05:46:55.789919 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270\": container with ID starting with 07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270 not found: ID does not exist" containerID="07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.789965 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270"} err="failed to get container status \"07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270\": rpc error: code = NotFound desc = could not find container \"07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270\": container with ID starting with 07c4ef64c83a00ce0691447c81e48566071cd0237c20596b763716e7dbdd1270 not found: ID does not exist" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.789990 4956 scope.go:117] "RemoveContainer" containerID="627c5233fd1951147a9f45a24876d048b39fffc8d598ba290be1bfec539e15b1" Sep 30 05:46:55 crc kubenswrapper[4956]: E0930 05:46:55.793469 4956 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"627c5233fd1951147a9f45a24876d048b39fffc8d598ba290be1bfec539e15b1\": container with ID starting with 627c5233fd1951147a9f45a24876d048b39fffc8d598ba290be1bfec539e15b1 not found: ID does not exist" containerID="627c5233fd1951147a9f45a24876d048b39fffc8d598ba290be1bfec539e15b1" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.793586 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"627c5233fd1951147a9f45a24876d048b39fffc8d598ba290be1bfec539e15b1"} err="failed to get container status \"627c5233fd1951147a9f45a24876d048b39fffc8d598ba290be1bfec539e15b1\": rpc error: code = NotFound desc = could not find container \"627c5233fd1951147a9f45a24876d048b39fffc8d598ba290be1bfec539e15b1\": container with ID starting with 627c5233fd1951147a9f45a24876d048b39fffc8d598ba290be1bfec539e15b1 not found: ID does not exist" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.825801 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-ovsdbserver-sb\") pod \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.825836 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ll9f\" (UniqueName: \"kubernetes.io/projected/87cb9729-9cb1-4377-834e-b6e5c11e39a3-kube-api-access-5ll9f\") pod \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.825858 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-dns-swift-storage-0\") pod 
\"87cb9729-9cb1-4377-834e-b6e5c11e39a3\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.825923 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-ovsdbserver-nb\") pod \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.826005 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-dns-svc\") pod \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.826061 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-config\") pod \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\" (UID: \"87cb9729-9cb1-4377-834e-b6e5c11e39a3\") " Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.845411 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cb9729-9cb1-4377-834e-b6e5c11e39a3-kube-api-access-5ll9f" (OuterVolumeSpecName: "kube-api-access-5ll9f") pod "87cb9729-9cb1-4377-834e-b6e5c11e39a3" (UID: "87cb9729-9cb1-4377-834e-b6e5c11e39a3"). InnerVolumeSpecName "kube-api-access-5ll9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.900237 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87cb9729-9cb1-4377-834e-b6e5c11e39a3" (UID: "87cb9729-9cb1-4377-834e-b6e5c11e39a3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.904187 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87cb9729-9cb1-4377-834e-b6e5c11e39a3" (UID: "87cb9729-9cb1-4377-834e-b6e5c11e39a3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.924185 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87cb9729-9cb1-4377-834e-b6e5c11e39a3" (UID: "87cb9729-9cb1-4377-834e-b6e5c11e39a3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.925156 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "87cb9729-9cb1-4377-834e-b6e5c11e39a3" (UID: "87cb9729-9cb1-4377-834e-b6e5c11e39a3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.933641 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.933757 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ll9f\" (UniqueName: \"kubernetes.io/projected/87cb9729-9cb1-4377-834e-b6e5c11e39a3-kube-api-access-5ll9f\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.933908 4956 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.933971 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.934033 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:55 crc kubenswrapper[4956]: I0930 05:46:55.937898 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-config" (OuterVolumeSpecName: "config") pod "87cb9729-9cb1-4377-834e-b6e5c11e39a3" (UID: "87cb9729-9cb1-4377-834e-b6e5c11e39a3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:46:56 crc kubenswrapper[4956]: I0930 05:46:56.035078 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87cb9729-9cb1-4377-834e-b6e5c11e39a3-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:56 crc kubenswrapper[4956]: I0930 05:46:56.059992 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c68846bf-fppc2"] Sep 30 05:46:56 crc kubenswrapper[4956]: I0930 05:46:56.067830 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84c68846bf-fppc2"] Sep 30 05:46:56 crc kubenswrapper[4956]: I0930 05:46:56.354160 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cb9729-9cb1-4377-834e-b6e5c11e39a3" path="/var/lib/kubelet/pods/87cb9729-9cb1-4377-834e-b6e5c11e39a3/volumes" Sep 30 05:46:56 crc kubenswrapper[4956]: I0930 05:46:56.744109 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="ceilometer-central-agent" containerID="cri-o://f1f6bae6d9337fae2dc8e2358a784aeae30bcadafb8b882179d4e55da638473f" gracePeriod=30 Sep 30 05:46:56 crc kubenswrapper[4956]: I0930 05:46:56.744140 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="proxy-httpd" containerID="cri-o://ba77419992750666eff2d90eb215a47f9f74e46273c1571d9d4bcc648f0a0029" gracePeriod=30 Sep 30 05:46:56 crc kubenswrapper[4956]: I0930 05:46:56.744140 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="sg-core" containerID="cri-o://96aef16228e08aee0b033086705676cc2afd3dd394bb8187b322cf0ce56a077b" gracePeriod=30 Sep 30 05:46:56 crc kubenswrapper[4956]: I0930 05:46:56.744211 4956 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="ceilometer-notification-agent" containerID="cri-o://ffd6a03764ccdec20a44f2b1a83894845d8faf098dfb15ac8bec62346120fe0c" gracePeriod=30 Sep 30 05:46:57 crc kubenswrapper[4956]: I0930 05:46:57.224268 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h5ch9" podUID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" containerName="registry-server" probeResult="failure" output=< Sep 30 05:46:57 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Sep 30 05:46:57 crc kubenswrapper[4956]: > Sep 30 05:46:57 crc kubenswrapper[4956]: I0930 05:46:57.771703 4956 generic.go:334] "Generic (PLEG): container finished" podID="47db223c-b39e-48be-adbe-d50da747580a" containerID="ba77419992750666eff2d90eb215a47f9f74e46273c1571d9d4bcc648f0a0029" exitCode=0 Sep 30 05:46:57 crc kubenswrapper[4956]: I0930 05:46:57.772069 4956 generic.go:334] "Generic (PLEG): container finished" podID="47db223c-b39e-48be-adbe-d50da747580a" containerID="96aef16228e08aee0b033086705676cc2afd3dd394bb8187b322cf0ce56a077b" exitCode=2 Sep 30 05:46:57 crc kubenswrapper[4956]: I0930 05:46:57.772080 4956 generic.go:334] "Generic (PLEG): container finished" podID="47db223c-b39e-48be-adbe-d50da747580a" containerID="ffd6a03764ccdec20a44f2b1a83894845d8faf098dfb15ac8bec62346120fe0c" exitCode=0 Sep 30 05:46:57 crc kubenswrapper[4956]: I0930 05:46:57.772090 4956 generic.go:334] "Generic (PLEG): container finished" podID="47db223c-b39e-48be-adbe-d50da747580a" containerID="f1f6bae6d9337fae2dc8e2358a784aeae30bcadafb8b882179d4e55da638473f" exitCode=0 Sep 30 05:46:57 crc kubenswrapper[4956]: I0930 05:46:57.772129 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"47db223c-b39e-48be-adbe-d50da747580a","Type":"ContainerDied","Data":"ba77419992750666eff2d90eb215a47f9f74e46273c1571d9d4bcc648f0a0029"} Sep 30 05:46:57 crc kubenswrapper[4956]: I0930 05:46:57.772159 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47db223c-b39e-48be-adbe-d50da747580a","Type":"ContainerDied","Data":"96aef16228e08aee0b033086705676cc2afd3dd394bb8187b322cf0ce56a077b"} Sep 30 05:46:57 crc kubenswrapper[4956]: I0930 05:46:57.772173 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47db223c-b39e-48be-adbe-d50da747580a","Type":"ContainerDied","Data":"ffd6a03764ccdec20a44f2b1a83894845d8faf098dfb15ac8bec62346120fe0c"} Sep 30 05:46:57 crc kubenswrapper[4956]: I0930 05:46:57.772184 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47db223c-b39e-48be-adbe-d50da747580a","Type":"ContainerDied","Data":"f1f6bae6d9337fae2dc8e2358a784aeae30bcadafb8b882179d4e55da638473f"} Sep 30 05:46:57 crc kubenswrapper[4956]: I0930 05:46:57.888573 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:46:57 crc kubenswrapper[4956]: I0930 05:46:57.888831 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a79ac967-4c58-483a-9ef8-76033c9c5d83" containerName="glance-log" containerID="cri-o://7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263" gracePeriod=30 Sep 30 05:46:57 crc kubenswrapper[4956]: I0930 05:46:57.888962 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a79ac967-4c58-483a-9ef8-76033c9c5d83" containerName="glance-httpd" containerID="cri-o://f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05" gracePeriod=30 Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.069046 4956 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.179442 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47db223c-b39e-48be-adbe-d50da747580a-log-httpd\") pod \"47db223c-b39e-48be-adbe-d50da747580a\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.179493 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgwvs\" (UniqueName: \"kubernetes.io/projected/47db223c-b39e-48be-adbe-d50da747580a-kube-api-access-rgwvs\") pod \"47db223c-b39e-48be-adbe-d50da747580a\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.179522 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-sg-core-conf-yaml\") pod \"47db223c-b39e-48be-adbe-d50da747580a\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.179611 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-scripts\") pod \"47db223c-b39e-48be-adbe-d50da747580a\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.179916 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47db223c-b39e-48be-adbe-d50da747580a-run-httpd\") pod \"47db223c-b39e-48be-adbe-d50da747580a\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.179962 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-config-data\") pod \"47db223c-b39e-48be-adbe-d50da747580a\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.180002 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47db223c-b39e-48be-adbe-d50da747580a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "47db223c-b39e-48be-adbe-d50da747580a" (UID: "47db223c-b39e-48be-adbe-d50da747580a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.180401 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47db223c-b39e-48be-adbe-d50da747580a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "47db223c-b39e-48be-adbe-d50da747580a" (UID: "47db223c-b39e-48be-adbe-d50da747580a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.180482 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-combined-ca-bundle\") pod \"47db223c-b39e-48be-adbe-d50da747580a\" (UID: \"47db223c-b39e-48be-adbe-d50da747580a\") " Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.181281 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47db223c-b39e-48be-adbe-d50da747580a-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.181299 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47db223c-b39e-48be-adbe-d50da747580a-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.184787 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-scripts" (OuterVolumeSpecName: "scripts") pod "47db223c-b39e-48be-adbe-d50da747580a" (UID: "47db223c-b39e-48be-adbe-d50da747580a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.208482 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47db223c-b39e-48be-adbe-d50da747580a-kube-api-access-rgwvs" (OuterVolumeSpecName: "kube-api-access-rgwvs") pod "47db223c-b39e-48be-adbe-d50da747580a" (UID: "47db223c-b39e-48be-adbe-d50da747580a"). InnerVolumeSpecName "kube-api-access-rgwvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.218329 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "47db223c-b39e-48be-adbe-d50da747580a" (UID: "47db223c-b39e-48be-adbe-d50da747580a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.285628 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.285887 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgwvs\" (UniqueName: \"kubernetes.io/projected/47db223c-b39e-48be-adbe-d50da747580a-kube-api-access-rgwvs\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.285898 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.287417 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47db223c-b39e-48be-adbe-d50da747580a" (UID: "47db223c-b39e-48be-adbe-d50da747580a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.304601 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-config-data" (OuterVolumeSpecName: "config-data") pod "47db223c-b39e-48be-adbe-d50da747580a" (UID: "47db223c-b39e-48be-adbe-d50da747580a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.391699 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.391734 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47db223c-b39e-48be-adbe-d50da747580a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.783500 4956 generic.go:334] "Generic (PLEG): container finished" podID="a79ac967-4c58-483a-9ef8-76033c9c5d83" containerID="7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263" exitCode=143 Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.783816 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a79ac967-4c58-483a-9ef8-76033c9c5d83","Type":"ContainerDied","Data":"7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263"} Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.786952 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47db223c-b39e-48be-adbe-d50da747580a","Type":"ContainerDied","Data":"6d40c0b3c14ec84aba5e1290c081a648fd6e8ab4fd50aab203a7363a206bfe95"} Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.787001 4956 scope.go:117] "RemoveContainer" 
containerID="ba77419992750666eff2d90eb215a47f9f74e46273c1571d9d4bcc648f0a0029" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.787144 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.890763 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.900376 4956 scope.go:117] "RemoveContainer" containerID="96aef16228e08aee0b033086705676cc2afd3dd394bb8187b322cf0ce56a077b" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.901278 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.919444 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:58 crc kubenswrapper[4956]: E0930 05:46:58.919827 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87cb9729-9cb1-4377-834e-b6e5c11e39a3" containerName="dnsmasq-dns" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.919839 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="87cb9729-9cb1-4377-834e-b6e5c11e39a3" containerName="dnsmasq-dns" Sep 30 05:46:58 crc kubenswrapper[4956]: E0930 05:46:58.919854 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="ceilometer-central-agent" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.919860 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="ceilometer-central-agent" Sep 30 05:46:58 crc kubenswrapper[4956]: E0930 05:46:58.919875 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="ceilometer-notification-agent" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.919881 4956 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="ceilometer-notification-agent" Sep 30 05:46:58 crc kubenswrapper[4956]: E0930 05:46:58.919898 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="sg-core" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.919904 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="sg-core" Sep 30 05:46:58 crc kubenswrapper[4956]: E0930 05:46:58.919916 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="proxy-httpd" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.919922 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="proxy-httpd" Sep 30 05:46:58 crc kubenswrapper[4956]: E0930 05:46:58.919933 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87cb9729-9cb1-4377-834e-b6e5c11e39a3" containerName="init" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.919940 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="87cb9729-9cb1-4377-834e-b6e5c11e39a3" containerName="init" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.920133 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="proxy-httpd" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.920160 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="sg-core" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.920171 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="ceilometer-central-agent" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.920181 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="87cb9729-9cb1-4377-834e-b6e5c11e39a3" containerName="dnsmasq-dns" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.920192 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="47db223c-b39e-48be-adbe-d50da747580a" containerName="ceilometer-notification-agent" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.922388 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.925293 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.925512 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.934196 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.949082 4956 scope.go:117] "RemoveContainer" containerID="ffd6a03764ccdec20a44f2b1a83894845d8faf098dfb15ac8bec62346120fe0c" Sep 30 05:46:58 crc kubenswrapper[4956]: I0930 05:46:58.980557 4956 scope.go:117] "RemoveContainer" containerID="f1f6bae6d9337fae2dc8e2358a784aeae30bcadafb8b882179d4e55da638473f" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.004475 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88802aae-2f11-483c-abaa-c768177f4f1f-run-httpd\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.004522 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.004560 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-config-data\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.004597 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-scripts\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.004908 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88802aae-2f11-483c-abaa-c768177f4f1f-log-httpd\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.004954 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.005002 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4wrr\" (UniqueName: \"kubernetes.io/projected/88802aae-2f11-483c-abaa-c768177f4f1f-kube-api-access-n4wrr\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.106919 
4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-config-data\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.106992 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-scripts\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.107055 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88802aae-2f11-483c-abaa-c768177f4f1f-log-httpd\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.107098 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.107219 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4wrr\" (UniqueName: \"kubernetes.io/projected/88802aae-2f11-483c-abaa-c768177f4f1f-kube-api-access-n4wrr\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.107273 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88802aae-2f11-483c-abaa-c768177f4f1f-run-httpd\") pod \"ceilometer-0\" (UID: 
\"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.107291 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.110254 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88802aae-2f11-483c-abaa-c768177f4f1f-log-httpd\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.110331 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88802aae-2f11-483c-abaa-c768177f4f1f-run-httpd\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.115458 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.120718 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.128244 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-config-data\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.119827 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-scripts\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.139849 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4wrr\" (UniqueName: \"kubernetes.io/projected/88802aae-2f11-483c-abaa-c768177f4f1f-kube-api-access-n4wrr\") pod \"ceilometer-0\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.174576 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.209258 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-config-data\") pod \"a79ac967-4c58-483a-9ef8-76033c9c5d83\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.209532 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-combined-ca-bundle\") pod \"a79ac967-4c58-483a-9ef8-76033c9c5d83\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.209686 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-public-tls-certs\") pod \"a79ac967-4c58-483a-9ef8-76033c9c5d83\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.209880 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2lw7\" (UniqueName: \"kubernetes.io/projected/a79ac967-4c58-483a-9ef8-76033c9c5d83-kube-api-access-m2lw7\") pod \"a79ac967-4c58-483a-9ef8-76033c9c5d83\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.210016 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a79ac967-4c58-483a-9ef8-76033c9c5d83-httpd-run\") pod \"a79ac967-4c58-483a-9ef8-76033c9c5d83\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.210107 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"a79ac967-4c58-483a-9ef8-76033c9c5d83\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.210223 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79ac967-4c58-483a-9ef8-76033c9c5d83-logs\") pod \"a79ac967-4c58-483a-9ef8-76033c9c5d83\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.210343 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-scripts\") pod \"a79ac967-4c58-483a-9ef8-76033c9c5d83\" (UID: \"a79ac967-4c58-483a-9ef8-76033c9c5d83\") " Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.212563 4956 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/a79ac967-4c58-483a-9ef8-76033c9c5d83-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a79ac967-4c58-483a-9ef8-76033c9c5d83" (UID: "a79ac967-4c58-483a-9ef8-76033c9c5d83"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.212866 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a79ac967-4c58-483a-9ef8-76033c9c5d83-logs" (OuterVolumeSpecName: "logs") pod "a79ac967-4c58-483a-9ef8-76033c9c5d83" (UID: "a79ac967-4c58-483a-9ef8-76033c9c5d83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.230201 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79ac967-4c58-483a-9ef8-76033c9c5d83-kube-api-access-m2lw7" (OuterVolumeSpecName: "kube-api-access-m2lw7") pod "a79ac967-4c58-483a-9ef8-76033c9c5d83" (UID: "a79ac967-4c58-483a-9ef8-76033c9c5d83"). InnerVolumeSpecName "kube-api-access-m2lw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.230463 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-scripts" (OuterVolumeSpecName: "scripts") pod "a79ac967-4c58-483a-9ef8-76033c9c5d83" (UID: "a79ac967-4c58-483a-9ef8-76033c9c5d83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.239728 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.248384 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "a79ac967-4c58-483a-9ef8-76033c9c5d83" (UID: "a79ac967-4c58-483a-9ef8-76033c9c5d83"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.298729 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a79ac967-4c58-483a-9ef8-76033c9c5d83" (UID: "a79ac967-4c58-483a-9ef8-76033c9c5d83"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.302368 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a79ac967-4c58-483a-9ef8-76033c9c5d83" (UID: "a79ac967-4c58-483a-9ef8-76033c9c5d83"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.312456 4956 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a79ac967-4c58-483a-9ef8-76033c9c5d83-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.312498 4956 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.312508 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79ac967-4c58-483a-9ef8-76033c9c5d83-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.312517 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.312525 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.312536 4956 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.312546 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2lw7\" (UniqueName: \"kubernetes.io/projected/a79ac967-4c58-483a-9ef8-76033c9c5d83-kube-api-access-m2lw7\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.334080 4956 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-config-data" (OuterVolumeSpecName: "config-data") pod "a79ac967-4c58-483a-9ef8-76033c9c5d83" (UID: "a79ac967-4c58-483a-9ef8-76033c9c5d83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.349571 4956 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.413820 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79ac967-4c58-483a-9ef8-76033c9c5d83-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.413847 4956 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.758236 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.843254 4956 generic.go:334] "Generic (PLEG): container finished" podID="a79ac967-4c58-483a-9ef8-76033c9c5d83" containerID="f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05" exitCode=0 Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.843337 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a79ac967-4c58-483a-9ef8-76033c9c5d83","Type":"ContainerDied","Data":"f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05"} Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.843378 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"a79ac967-4c58-483a-9ef8-76033c9c5d83","Type":"ContainerDied","Data":"06668260a629a8ef55be2a234e2b6c86c96d714d98fca89ad5f346989dfa46a4"} Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.843395 4956 scope.go:117] "RemoveContainer" containerID="f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.843419 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.880292 4956 scope.go:117] "RemoveContainer" containerID="7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.896584 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.907102 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.915827 4956 scope.go:117] "RemoveContainer" containerID="f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05" Sep 30 05:46:59 crc kubenswrapper[4956]: E0930 05:46:59.916364 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05\": container with ID starting with f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05 not found: ID does not exist" containerID="f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.916392 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05"} err="failed to get container status 
\"f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05\": rpc error: code = NotFound desc = could not find container \"f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05\": container with ID starting with f7e0a4113415c8cd18d9ad5d54734aa752bb2d5c3e592a9f4ba6b1d59f0e7c05 not found: ID does not exist" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.916417 4956 scope.go:117] "RemoveContainer" containerID="7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263" Sep 30 05:46:59 crc kubenswrapper[4956]: E0930 05:46:59.916813 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263\": container with ID starting with 7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263 not found: ID does not exist" containerID="7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.916843 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263"} err="failed to get container status \"7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263\": rpc error: code = NotFound desc = could not find container \"7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263\": container with ID starting with 7f7046b5344484c646bf01b6918a586788f5bb491ff733f7972ad35e4fbdb263 not found: ID does not exist" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.941826 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:46:59 crc kubenswrapper[4956]: E0930 05:46:59.942188 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79ac967-4c58-483a-9ef8-76033c9c5d83" containerName="glance-httpd" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.942204 4956 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a79ac967-4c58-483a-9ef8-76033c9c5d83" containerName="glance-httpd" Sep 30 05:46:59 crc kubenswrapper[4956]: E0930 05:46:59.942218 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79ac967-4c58-483a-9ef8-76033c9c5d83" containerName="glance-log" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.942223 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79ac967-4c58-483a-9ef8-76033c9c5d83" containerName="glance-log" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.942417 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79ac967-4c58-483a-9ef8-76033c9c5d83" containerName="glance-log" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.942434 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79ac967-4c58-483a-9ef8-76033c9c5d83" containerName="glance-httpd" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.943419 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.962000 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.965480 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 05:46:59 crc kubenswrapper[4956]: I0930 05:46:59.966283 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.078705 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.117823 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.131218 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a834394a-7f87-43b1-aebb-e61b5916077c-logs\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.131298 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a834394a-7f87-43b1-aebb-e61b5916077c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.131390 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dclrj\" (UniqueName: 
\"kubernetes.io/projected/a834394a-7f87-43b1-aebb-e61b5916077c-kube-api-access-dclrj\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.131411 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.131459 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a834394a-7f87-43b1-aebb-e61b5916077c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.131477 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a834394a-7f87-43b1-aebb-e61b5916077c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.131505 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a834394a-7f87-43b1-aebb-e61b5916077c-scripts\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.131539 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a834394a-7f87-43b1-aebb-e61b5916077c-config-data\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.232827 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a834394a-7f87-43b1-aebb-e61b5916077c-scripts\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.233109 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a834394a-7f87-43b1-aebb-e61b5916077c-config-data\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.233163 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a834394a-7f87-43b1-aebb-e61b5916077c-logs\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.233206 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a834394a-7f87-43b1-aebb-e61b5916077c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.233268 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dclrj\" (UniqueName: 
\"kubernetes.io/projected/a834394a-7f87-43b1-aebb-e61b5916077c-kube-api-access-dclrj\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.233286 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.233355 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a834394a-7f87-43b1-aebb-e61b5916077c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.233372 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a834394a-7f87-43b1-aebb-e61b5916077c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.234214 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a834394a-7f87-43b1-aebb-e61b5916077c-logs\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.234242 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" 
(UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.234381 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a834394a-7f87-43b1-aebb-e61b5916077c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.239248 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a834394a-7f87-43b1-aebb-e61b5916077c-scripts\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.239514 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a834394a-7f87-43b1-aebb-e61b5916077c-config-data\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.240134 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a834394a-7f87-43b1-aebb-e61b5916077c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.251802 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a834394a-7f87-43b1-aebb-e61b5916077c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " 
pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.256829 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dclrj\" (UniqueName: \"kubernetes.io/projected/a834394a-7f87-43b1-aebb-e61b5916077c-kube-api-access-dclrj\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.271138 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a834394a-7f87-43b1-aebb-e61b5916077c\") " pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.278657 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.354895 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47db223c-b39e-48be-adbe-d50da747580a" path="/var/lib/kubelet/pods/47db223c-b39e-48be-adbe-d50da747580a/volumes" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.355691 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79ac967-4c58-483a-9ef8-76033c9c5d83" path="/var/lib/kubelet/pods/a79ac967-4c58-483a-9ef8-76033c9c5d83/volumes" Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.848557 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 05:47:00 crc kubenswrapper[4956]: W0930 05:47:00.849157 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda834394a_7f87_43b1_aebb_e61b5916077c.slice/crio-8a54e487916b820b525a5274b5feff5ff2f4374b74ba2b904168a8df06b9e601 WatchSource:0}: Error finding 
container 8a54e487916b820b525a5274b5feff5ff2f4374b74ba2b904168a8df06b9e601: Status 404 returned error can't find the container with id 8a54e487916b820b525a5274b5feff5ff2f4374b74ba2b904168a8df06b9e601 Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.891169 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88802aae-2f11-483c-abaa-c768177f4f1f","Type":"ContainerStarted","Data":"e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64"} Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.891219 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88802aae-2f11-483c-abaa-c768177f4f1f","Type":"ContainerStarted","Data":"860611fa1e2b6c7cc94da56845ffa440ecd8f86f8f7cc82e71693432c19d6825"} Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.891460 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cd011aa6-7a4c-44b4-8b90-c1e07e06f779" containerName="cinder-scheduler" containerID="cri-o://978fae24e59f3d8ac596b4f0a4f7fa1b511b92e045f3136efaabf8b5cda33852" gracePeriod=30 Sep 30 05:47:00 crc kubenswrapper[4956]: I0930 05:47:00.891618 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cd011aa6-7a4c-44b4-8b90-c1e07e06f779" containerName="probe" containerID="cri-o://1dba6ae6b9516d9532b383ef5fd837429a2882087fa8f797c670af9b211b4362" gracePeriod=30 Sep 30 05:47:01 crc kubenswrapper[4956]: I0930 05:47:01.059832 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:47:01 crc kubenswrapper[4956]: I0930 05:47:01.060143 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7d2aafb3-866b-41ad-bdeb-38e53b81934a" containerName="glance-log" 
containerID="cri-o://b1350995cd78544dc325bbdb0c09a57997f0575e9bbf4b8a54b502be3f92321e" gracePeriod=30 Sep 30 05:47:01 crc kubenswrapper[4956]: I0930 05:47:01.061275 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7d2aafb3-866b-41ad-bdeb-38e53b81934a" containerName="glance-httpd" containerID="cri-o://ea0b8f4e921fbd058a0594225cbca024b0346555bbd3a1c2aa75e707852254a2" gracePeriod=30 Sep 30 05:47:01 crc kubenswrapper[4956]: I0930 05:47:01.907564 4956 generic.go:334] "Generic (PLEG): container finished" podID="7d2aafb3-866b-41ad-bdeb-38e53b81934a" containerID="b1350995cd78544dc325bbdb0c09a57997f0575e9bbf4b8a54b502be3f92321e" exitCode=143 Sep 30 05:47:01 crc kubenswrapper[4956]: I0930 05:47:01.908014 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d2aafb3-866b-41ad-bdeb-38e53b81934a","Type":"ContainerDied","Data":"b1350995cd78544dc325bbdb0c09a57997f0575e9bbf4b8a54b502be3f92321e"} Sep 30 05:47:01 crc kubenswrapper[4956]: I0930 05:47:01.916030 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a834394a-7f87-43b1-aebb-e61b5916077c","Type":"ContainerStarted","Data":"de54e36a32eeefe7bd19d708a6f0f6a69903b9e18d90995dfeb498fc121c4817"} Sep 30 05:47:01 crc kubenswrapper[4956]: I0930 05:47:01.916075 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a834394a-7f87-43b1-aebb-e61b5916077c","Type":"ContainerStarted","Data":"8a54e487916b820b525a5274b5feff5ff2f4374b74ba2b904168a8df06b9e601"} Sep 30 05:47:01 crc kubenswrapper[4956]: I0930 05:47:01.935993 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88802aae-2f11-483c-abaa-c768177f4f1f","Type":"ContainerStarted","Data":"a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b"} Sep 30 05:47:02 crc 
kubenswrapper[4956]: I0930 05:47:02.721051 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:47:02 crc kubenswrapper[4956]: I0930 05:47:02.947571 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a834394a-7f87-43b1-aebb-e61b5916077c","Type":"ContainerStarted","Data":"87696866fa25f91e3eca04b3c7255856a3d4536d1981042f2e81e8c9a9bf38d7"} Sep 30 05:47:02 crc kubenswrapper[4956]: I0930 05:47:02.949934 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88802aae-2f11-483c-abaa-c768177f4f1f","Type":"ContainerStarted","Data":"cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568"} Sep 30 05:47:02 crc kubenswrapper[4956]: I0930 05:47:02.951883 4956 generic.go:334] "Generic (PLEG): container finished" podID="7d2aafb3-866b-41ad-bdeb-38e53b81934a" containerID="ea0b8f4e921fbd058a0594225cbca024b0346555bbd3a1c2aa75e707852254a2" exitCode=0 Sep 30 05:47:02 crc kubenswrapper[4956]: I0930 05:47:02.951940 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d2aafb3-866b-41ad-bdeb-38e53b81934a","Type":"ContainerDied","Data":"ea0b8f4e921fbd058a0594225cbca024b0346555bbd3a1c2aa75e707852254a2"} Sep 30 05:47:02 crc kubenswrapper[4956]: I0930 05:47:02.965576 4956 generic.go:334] "Generic (PLEG): container finished" podID="cd011aa6-7a4c-44b4-8b90-c1e07e06f779" containerID="1dba6ae6b9516d9532b383ef5fd837429a2882087fa8f797c670af9b211b4362" exitCode=0 Sep 30 05:47:02 crc kubenswrapper[4956]: I0930 05:47:02.965630 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cd011aa6-7a4c-44b4-8b90-c1e07e06f779","Type":"ContainerDied","Data":"1dba6ae6b9516d9532b383ef5fd837429a2882087fa8f797c670af9b211b4362"} Sep 30 05:47:02 crc kubenswrapper[4956]: I0930 05:47:02.972990 4956 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.972974559 podStartE2EDuration="3.972974559s" podCreationTimestamp="2025-09-30 05:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:47:02.968480407 +0000 UTC m=+1093.295600932" watchObservedRunningTime="2025-09-30 05:47:02.972974559 +0000 UTC m=+1093.300095084" Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.831789 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.913775 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-combined-ca-bundle\") pod \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.913824 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d2aafb3-866b-41ad-bdeb-38e53b81934a-logs\") pod \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.913864 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-scripts\") pod \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.913902 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " Sep 30 05:47:03 
crc kubenswrapper[4956]: I0930 05:47:03.913921 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d2aafb3-866b-41ad-bdeb-38e53b81934a-httpd-run\") pod \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.914093 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-config-data\") pod \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.914161 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrcpm\" (UniqueName: \"kubernetes.io/projected/7d2aafb3-866b-41ad-bdeb-38e53b81934a-kube-api-access-wrcpm\") pod \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.914195 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-internal-tls-certs\") pod \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\" (UID: \"7d2aafb3-866b-41ad-bdeb-38e53b81934a\") " Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.919890 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d2aafb3-866b-41ad-bdeb-38e53b81934a-logs" (OuterVolumeSpecName: "logs") pod "7d2aafb3-866b-41ad-bdeb-38e53b81934a" (UID: "7d2aafb3-866b-41ad-bdeb-38e53b81934a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.922060 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "7d2aafb3-866b-41ad-bdeb-38e53b81934a" (UID: "7d2aafb3-866b-41ad-bdeb-38e53b81934a"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.924213 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d2aafb3-866b-41ad-bdeb-38e53b81934a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7d2aafb3-866b-41ad-bdeb-38e53b81934a" (UID: "7d2aafb3-866b-41ad-bdeb-38e53b81934a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.927206 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-scripts" (OuterVolumeSpecName: "scripts") pod "7d2aafb3-866b-41ad-bdeb-38e53b81934a" (UID: "7d2aafb3-866b-41ad-bdeb-38e53b81934a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.934714 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d2aafb3-866b-41ad-bdeb-38e53b81934a-kube-api-access-wrcpm" (OuterVolumeSpecName: "kube-api-access-wrcpm") pod "7d2aafb3-866b-41ad-bdeb-38e53b81934a" (UID: "7d2aafb3-866b-41ad-bdeb-38e53b81934a"). InnerVolumeSpecName "kube-api-access-wrcpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.962669 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d2aafb3-866b-41ad-bdeb-38e53b81934a" (UID: "7d2aafb3-866b-41ad-bdeb-38e53b81934a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.980600 4956 generic.go:334] "Generic (PLEG): container finished" podID="cd011aa6-7a4c-44b4-8b90-c1e07e06f779" containerID="978fae24e59f3d8ac596b4f0a4f7fa1b511b92e045f3136efaabf8b5cda33852" exitCode=0 Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.980672 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cd011aa6-7a4c-44b4-8b90-c1e07e06f779","Type":"ContainerDied","Data":"978fae24e59f3d8ac596b4f0a4f7fa1b511b92e045f3136efaabf8b5cda33852"} Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.983222 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-config-data" (OuterVolumeSpecName: "config-data") pod "7d2aafb3-866b-41ad-bdeb-38e53b81934a" (UID: "7d2aafb3-866b-41ad-bdeb-38e53b81934a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.985697 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88802aae-2f11-483c-abaa-c768177f4f1f","Type":"ContainerStarted","Data":"3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800"} Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.985910 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="ceilometer-central-agent" containerID="cri-o://e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64" gracePeriod=30 Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.986246 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.986290 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="proxy-httpd" containerID="cri-o://3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800" gracePeriod=30 Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.986420 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="sg-core" containerID="cri-o://cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568" gracePeriod=30 Sep 30 05:47:03 crc kubenswrapper[4956]: I0930 05:47:03.986484 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="ceilometer-notification-agent" containerID="cri-o://a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b" gracePeriod=30 Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.011353 4956 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.011347 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7d2aafb3-866b-41ad-bdeb-38e53b81934a","Type":"ContainerDied","Data":"2742293e6b987c75fb0fe5643f3cd791e1510d598275d9a695949f2e6f04fb51"} Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.011537 4956 scope.go:117] "RemoveContainer" containerID="ea0b8f4e921fbd058a0594225cbca024b0346555bbd3a1c2aa75e707852254a2" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.011512 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7d2aafb3-866b-41ad-bdeb-38e53b81934a" (UID: "7d2aafb3-866b-41ad-bdeb-38e53b81934a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.016469 4956 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.016504 4956 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d2aafb3-866b-41ad-bdeb-38e53b81934a-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.016513 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.016522 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrcpm\" (UniqueName: 
\"kubernetes.io/projected/7d2aafb3-866b-41ad-bdeb-38e53b81934a-kube-api-access-wrcpm\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.016535 4956 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.016544 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.016554 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d2aafb3-866b-41ad-bdeb-38e53b81934a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.016562 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2aafb3-866b-41ad-bdeb-38e53b81934a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.025620 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.620415047 podStartE2EDuration="6.025598413s" podCreationTimestamp="2025-09-30 05:46:58 +0000 UTC" firstStartedPulling="2025-09-30 05:46:59.83712746 +0000 UTC m=+1090.164247985" lastFinishedPulling="2025-09-30 05:47:03.242310826 +0000 UTC m=+1093.569431351" observedRunningTime="2025-09-30 05:47:04.012614475 +0000 UTC m=+1094.339735020" watchObservedRunningTime="2025-09-30 05:47:04.025598413 +0000 UTC m=+1094.352718938" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.039194 4956 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node 
"crc" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.064612 4956 scope.go:117] "RemoveContainer" containerID="b1350995cd78544dc325bbdb0c09a57997f0575e9bbf4b8a54b502be3f92321e" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.065445 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.120363 4956 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.222866 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-config-data\") pod \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.222967 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-config-data-custom\") pod \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.223039 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-scripts\") pod \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.223216 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-combined-ca-bundle\") pod \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\" (UID: 
\"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.223237 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbskz\" (UniqueName: \"kubernetes.io/projected/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-kube-api-access-hbskz\") pod \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.223258 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-etc-machine-id\") pod \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\" (UID: \"cd011aa6-7a4c-44b4-8b90-c1e07e06f779\") " Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.223684 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cd011aa6-7a4c-44b4-8b90-c1e07e06f779" (UID: "cd011aa6-7a4c-44b4-8b90-c1e07e06f779"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.230292 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-kube-api-access-hbskz" (OuterVolumeSpecName: "kube-api-access-hbskz") pod "cd011aa6-7a4c-44b4-8b90-c1e07e06f779" (UID: "cd011aa6-7a4c-44b4-8b90-c1e07e06f779"). InnerVolumeSpecName "kube-api-access-hbskz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.231370 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cd011aa6-7a4c-44b4-8b90-c1e07e06f779" (UID: "cd011aa6-7a4c-44b4-8b90-c1e07e06f779"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.235242 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-scripts" (OuterVolumeSpecName: "scripts") pod "cd011aa6-7a4c-44b4-8b90-c1e07e06f779" (UID: "cd011aa6-7a4c-44b4-8b90-c1e07e06f779"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.310828 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd011aa6-7a4c-44b4-8b90-c1e07e06f779" (UID: "cd011aa6-7a4c-44b4-8b90-c1e07e06f779"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.326059 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.326095 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbskz\" (UniqueName: \"kubernetes.io/projected/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-kube-api-access-hbskz\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.326107 4956 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.326134 4956 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.326147 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.359267 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-config-data" (OuterVolumeSpecName: "config-data") pod "cd011aa6-7a4c-44b4-8b90-c1e07e06f779" (UID: "cd011aa6-7a4c-44b4-8b90-c1e07e06f779"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.427415 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd011aa6-7a4c-44b4-8b90-c1e07e06f779-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.456551 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.466347 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.487405 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:47:04 crc kubenswrapper[4956]: E0930 05:47:04.487789 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd011aa6-7a4c-44b4-8b90-c1e07e06f779" containerName="probe" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.487807 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd011aa6-7a4c-44b4-8b90-c1e07e06f779" containerName="probe" Sep 30 05:47:04 crc kubenswrapper[4956]: E0930 05:47:04.487821 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d2aafb3-866b-41ad-bdeb-38e53b81934a" containerName="glance-log" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.487828 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2aafb3-866b-41ad-bdeb-38e53b81934a" containerName="glance-log" Sep 30 05:47:04 crc kubenswrapper[4956]: E0930 05:47:04.487853 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd011aa6-7a4c-44b4-8b90-c1e07e06f779" containerName="cinder-scheduler" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.487859 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd011aa6-7a4c-44b4-8b90-c1e07e06f779" containerName="cinder-scheduler" Sep 30 05:47:04 crc 
kubenswrapper[4956]: E0930 05:47:04.487871 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d2aafb3-866b-41ad-bdeb-38e53b81934a" containerName="glance-httpd" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.487877 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2aafb3-866b-41ad-bdeb-38e53b81934a" containerName="glance-httpd" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.488050 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd011aa6-7a4c-44b4-8b90-c1e07e06f779" containerName="probe" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.488063 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d2aafb3-866b-41ad-bdeb-38e53b81934a" containerName="glance-log" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.488076 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd011aa6-7a4c-44b4-8b90-c1e07e06f779" containerName="cinder-scheduler" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.488089 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d2aafb3-866b-41ad-bdeb-38e53b81934a" containerName="glance-httpd" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.489092 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.490623 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.491237 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.505351 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.631047 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.631101 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.631145 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.631179 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.631197 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.631283 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.631311 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2jfn\" (UniqueName: \"kubernetes.io/projected/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-kube-api-access-p2jfn\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.631354 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.734761 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.734810 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.734836 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.734868 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.734883 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.734936 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.734964 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2jfn\" (UniqueName: \"kubernetes.io/projected/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-kube-api-access-p2jfn\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.735008 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.735070 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.735519 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.735670 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") 
" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.740485 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.741427 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.743689 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.752781 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2jfn\" (UniqueName: \"kubernetes.io/projected/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-kube-api-access-p2jfn\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.752827 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d68b4b8-0213-4951-bf44-a8f7c6a1677c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc 
kubenswrapper[4956]: I0930 05:47:04.811794 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d68b4b8-0213-4951-bf44-a8f7c6a1677c\") " pod="openstack/glance-default-internal-api-0" Sep 30 05:47:04 crc kubenswrapper[4956]: I0930 05:47:04.851442 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.023369 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cd011aa6-7a4c-44b4-8b90-c1e07e06f779","Type":"ContainerDied","Data":"41ee0311c70da3cdbe14c259fb3586f8ec7bf0a790d93a31b99a663e55d0faa1"} Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.023421 4956 scope.go:117] "RemoveContainer" containerID="1dba6ae6b9516d9532b383ef5fd837429a2882087fa8f797c670af9b211b4362" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.023566 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.041343 4956 generic.go:334] "Generic (PLEG): container finished" podID="88802aae-2f11-483c-abaa-c768177f4f1f" containerID="3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800" exitCode=0 Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.041375 4956 generic.go:334] "Generic (PLEG): container finished" podID="88802aae-2f11-483c-abaa-c768177f4f1f" containerID="cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568" exitCode=2 Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.041384 4956 generic.go:334] "Generic (PLEG): container finished" podID="88802aae-2f11-483c-abaa-c768177f4f1f" containerID="a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b" exitCode=0 Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.041422 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88802aae-2f11-483c-abaa-c768177f4f1f","Type":"ContainerDied","Data":"3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800"} Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.041450 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88802aae-2f11-483c-abaa-c768177f4f1f","Type":"ContainerDied","Data":"cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568"} Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.041461 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88802aae-2f11-483c-abaa-c768177f4f1f","Type":"ContainerDied","Data":"a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b"} Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.048604 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.064575 4956 scope.go:117] "RemoveContainer" 
containerID="978fae24e59f3d8ac596b4f0a4f7fa1b511b92e045f3136efaabf8b5cda33852" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.065184 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.074438 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.075969 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.085031 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.087820 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.143244 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a945d1b-fce5-4069-8fb9-6c483a712cd2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.143309 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a945d1b-fce5-4069-8fb9-6c483a712cd2-scripts\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.143371 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wltvk\" (UniqueName: \"kubernetes.io/projected/6a945d1b-fce5-4069-8fb9-6c483a712cd2-kube-api-access-wltvk\") pod \"cinder-scheduler-0\" (UID: 
\"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.143412 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a945d1b-fce5-4069-8fb9-6c483a712cd2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.143444 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a945d1b-fce5-4069-8fb9-6c483a712cd2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.143488 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a945d1b-fce5-4069-8fb9-6c483a712cd2-config-data\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.245208 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a945d1b-fce5-4069-8fb9-6c483a712cd2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.245254 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a945d1b-fce5-4069-8fb9-6c483a712cd2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 
crc kubenswrapper[4956]: I0930 05:47:05.245301 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a945d1b-fce5-4069-8fb9-6c483a712cd2-config-data\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.245384 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a945d1b-fce5-4069-8fb9-6c483a712cd2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.245413 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a945d1b-fce5-4069-8fb9-6c483a712cd2-scripts\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.245454 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wltvk\" (UniqueName: \"kubernetes.io/projected/6a945d1b-fce5-4069-8fb9-6c483a712cd2-kube-api-access-wltvk\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.246402 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a945d1b-fce5-4069-8fb9-6c483a712cd2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.250482 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6a945d1b-fce5-4069-8fb9-6c483a712cd2-scripts\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.250591 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a945d1b-fce5-4069-8fb9-6c483a712cd2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.251069 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a945d1b-fce5-4069-8fb9-6c483a712cd2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.252070 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a945d1b-fce5-4069-8fb9-6c483a712cd2-config-data\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.262505 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wltvk\" (UniqueName: \"kubernetes.io/projected/6a945d1b-fce5-4069-8fb9-6c483a712cd2-kube-api-access-wltvk\") pod \"cinder-scheduler-0\" (UID: \"6a945d1b-fce5-4069-8fb9-6c483a712cd2\") " pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.417558 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.443228 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.727390 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 05:47:05 crc kubenswrapper[4956]: I0930 05:47:05.884010 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 05:47:05 crc kubenswrapper[4956]: W0930 05:47:05.892650 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a945d1b_fce5_4069_8fb9_6c483a712cd2.slice/crio-ba22b5b60baea0bde808b0083d2a6053dfc3783af4c02d5b69feb5ab136bc53b WatchSource:0}: Error finding container ba22b5b60baea0bde808b0083d2a6053dfc3783af4c02d5b69feb5ab136bc53b: Status 404 returned error can't find the container with id ba22b5b60baea0bde808b0083d2a6053dfc3783af4c02d5b69feb5ab136bc53b Sep 30 05:47:06 crc kubenswrapper[4956]: I0930 05:47:06.084950 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6a945d1b-fce5-4069-8fb9-6c483a712cd2","Type":"ContainerStarted","Data":"ba22b5b60baea0bde808b0083d2a6053dfc3783af4c02d5b69feb5ab136bc53b"} Sep 30 05:47:06 crc kubenswrapper[4956]: I0930 05:47:06.142927 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d68b4b8-0213-4951-bf44-a8f7c6a1677c","Type":"ContainerStarted","Data":"298c7861705830a66400e2bcb833b12ce31f96c85a8df700b4fee7bbebd85cd5"} Sep 30 05:47:06 crc kubenswrapper[4956]: I0930 05:47:06.280164 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:47:06 crc kubenswrapper[4956]: I0930 05:47:06.336298 4956 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:47:06 crc kubenswrapper[4956]: I0930 05:47:06.358239 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d2aafb3-866b-41ad-bdeb-38e53b81934a" path="/var/lib/kubelet/pods/7d2aafb3-866b-41ad-bdeb-38e53b81934a/volumes" Sep 30 05:47:06 crc kubenswrapper[4956]: I0930 05:47:06.359476 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd011aa6-7a4c-44b4-8b90-c1e07e06f779" path="/var/lib/kubelet/pods/cd011aa6-7a4c-44b4-8b90-c1e07e06f779/volumes" Sep 30 05:47:06 crc kubenswrapper[4956]: I0930 05:47:06.510813 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h5ch9"] Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.158920 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6a945d1b-fce5-4069-8fb9-6c483a712cd2","Type":"ContainerStarted","Data":"1ed128b91b9315e048a2c3e2d32943adda6b4aefb936bb27f6c1547287850315"} Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.161139 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d68b4b8-0213-4951-bf44-a8f7c6a1677c","Type":"ContainerStarted","Data":"c65b2d2cbb080cfea77c2a8e3ef0117085c5b3d7a6e86e8b69007f0f7b56bb34"} Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.161190 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d68b4b8-0213-4951-bf44-a8f7c6a1677c","Type":"ContainerStarted","Data":"e3851b292e74089199107dff4cf0c540d141a75f1dc0e1063f7a463ccd4d2a46"} Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.181797 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.181776611 podStartE2EDuration="3.181776611s" podCreationTimestamp="2025-09-30 05:47:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:47:07.176067322 +0000 UTC m=+1097.503187857" watchObservedRunningTime="2025-09-30 05:47:07.181776611 +0000 UTC m=+1097.508897126" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.608675 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.722616 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-sg-core-conf-yaml\") pod \"88802aae-2f11-483c-abaa-c768177f4f1f\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.722704 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-scripts\") pod \"88802aae-2f11-483c-abaa-c768177f4f1f\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.722741 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4wrr\" (UniqueName: \"kubernetes.io/projected/88802aae-2f11-483c-abaa-c768177f4f1f-kube-api-access-n4wrr\") pod \"88802aae-2f11-483c-abaa-c768177f4f1f\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.722778 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88802aae-2f11-483c-abaa-c768177f4f1f-run-httpd\") pod \"88802aae-2f11-483c-abaa-c768177f4f1f\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.722818 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-config-data\") pod \"88802aae-2f11-483c-abaa-c768177f4f1f\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.722843 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88802aae-2f11-483c-abaa-c768177f4f1f-log-httpd\") pod \"88802aae-2f11-483c-abaa-c768177f4f1f\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.722866 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-combined-ca-bundle\") pod \"88802aae-2f11-483c-abaa-c768177f4f1f\" (UID: \"88802aae-2f11-483c-abaa-c768177f4f1f\") " Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.723967 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88802aae-2f11-483c-abaa-c768177f4f1f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "88802aae-2f11-483c-abaa-c768177f4f1f" (UID: "88802aae-2f11-483c-abaa-c768177f4f1f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.725654 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88802aae-2f11-483c-abaa-c768177f4f1f-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.726513 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88802aae-2f11-483c-abaa-c768177f4f1f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "88802aae-2f11-483c-abaa-c768177f4f1f" (UID: "88802aae-2f11-483c-abaa-c768177f4f1f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.730864 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88802aae-2f11-483c-abaa-c768177f4f1f-kube-api-access-n4wrr" (OuterVolumeSpecName: "kube-api-access-n4wrr") pod "88802aae-2f11-483c-abaa-c768177f4f1f" (UID: "88802aae-2f11-483c-abaa-c768177f4f1f"). InnerVolumeSpecName "kube-api-access-n4wrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.745939 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-scripts" (OuterVolumeSpecName: "scripts") pod "88802aae-2f11-483c-abaa-c768177f4f1f" (UID: "88802aae-2f11-483c-abaa-c768177f4f1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.766208 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "88802aae-2f11-483c-abaa-c768177f4f1f" (UID: "88802aae-2f11-483c-abaa-c768177f4f1f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.815307 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88802aae-2f11-483c-abaa-c768177f4f1f" (UID: "88802aae-2f11-483c-abaa-c768177f4f1f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.830334 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.830372 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.830382 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4wrr\" (UniqueName: \"kubernetes.io/projected/88802aae-2f11-483c-abaa-c768177f4f1f-kube-api-access-n4wrr\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.830390 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88802aae-2f11-483c-abaa-c768177f4f1f-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.830400 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.840236 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-config-data" (OuterVolumeSpecName: "config-data") pod "88802aae-2f11-483c-abaa-c768177f4f1f" (UID: "88802aae-2f11-483c-abaa-c768177f4f1f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:07 crc kubenswrapper[4956]: I0930 05:47:07.932535 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88802aae-2f11-483c-abaa-c768177f4f1f-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.178991 4956 generic.go:334] "Generic (PLEG): container finished" podID="88802aae-2f11-483c-abaa-c768177f4f1f" containerID="e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64" exitCode=0 Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.179079 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88802aae-2f11-483c-abaa-c768177f4f1f","Type":"ContainerDied","Data":"e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64"} Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.179108 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88802aae-2f11-483c-abaa-c768177f4f1f","Type":"ContainerDied","Data":"860611fa1e2b6c7cc94da56845ffa440ecd8f86f8f7cc82e71693432c19d6825"} Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.179129 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.179138 4956 scope.go:117] "RemoveContainer" containerID="3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.184218 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6a945d1b-fce5-4069-8fb9-6c483a712cd2","Type":"ContainerStarted","Data":"f3dfb5e68881fa4350e0a81bddb7355bf1bf3ccaa9345a1ab3badc9848db3767"} Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.184626 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h5ch9" podUID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" containerName="registry-server" containerID="cri-o://ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92" gracePeriod=2 Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.205744 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.205723513 podStartE2EDuration="3.205723513s" podCreationTimestamp="2025-09-30 05:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:47:08.199301901 +0000 UTC m=+1098.526422436" watchObservedRunningTime="2025-09-30 05:47:08.205723513 +0000 UTC m=+1098.532844028" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.216672 4956 scope.go:117] "RemoveContainer" containerID="cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.225703 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.239367 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 
05:47:08.239903 4956 scope.go:117] "RemoveContainer" containerID="a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.260991 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:47:08 crc kubenswrapper[4956]: E0930 05:47:08.261560 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="ceilometer-central-agent" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.261666 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="ceilometer-central-agent" Sep 30 05:47:08 crc kubenswrapper[4956]: E0930 05:47:08.261735 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="ceilometer-notification-agent" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.261788 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="ceilometer-notification-agent" Sep 30 05:47:08 crc kubenswrapper[4956]: E0930 05:47:08.261846 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="sg-core" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.261898 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="sg-core" Sep 30 05:47:08 crc kubenswrapper[4956]: E0930 05:47:08.261951 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="proxy-httpd" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.262004 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="proxy-httpd" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.262247 4956 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="ceilometer-notification-agent" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.262379 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="sg-core" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.262436 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="ceilometer-central-agent" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.262502 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" containerName="proxy-httpd" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.262559 4956 scope.go:117] "RemoveContainer" containerID="e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.264329 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.266500 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.266756 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.310610 4956 scope.go:117] "RemoveContainer" containerID="3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800" Sep 30 05:47:08 crc kubenswrapper[4956]: E0930 05:47:08.311088 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800\": container with ID starting with 3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800 not found: ID does not exist" containerID="3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.311469 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800"} err="failed to get container status \"3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800\": rpc error: code = NotFound desc = could not find container \"3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800\": container with ID starting with 3c6ddb2108e9eb800ace2322839774107954053c6b9bf719f6fc00630e973800 not found: ID does not exist" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.311532 4956 scope.go:117] "RemoveContainer" containerID="cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.311647 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:47:08 crc kubenswrapper[4956]: E0930 
05:47:08.312045 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568\": container with ID starting with cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568 not found: ID does not exist" containerID="cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.312074 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568"} err="failed to get container status \"cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568\": rpc error: code = NotFound desc = could not find container \"cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568\": container with ID starting with cddf2443fe66949c460f19eb12a7867a6e954ad262ac61ec6ce5c41d0b86d568 not found: ID does not exist" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.312093 4956 scope.go:117] "RemoveContainer" containerID="a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b" Sep 30 05:47:08 crc kubenswrapper[4956]: E0930 05:47:08.312320 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b\": container with ID starting with a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b not found: ID does not exist" containerID="a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.312350 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b"} err="failed to get container status \"a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b\": rpc 
error: code = NotFound desc = could not find container \"a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b\": container with ID starting with a953f5f113670d9b3d9f1483f3303bac2c7d60aade052fce344b545b21253f9b not found: ID does not exist" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.312368 4956 scope.go:117] "RemoveContainer" containerID="e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64" Sep 30 05:47:08 crc kubenswrapper[4956]: E0930 05:47:08.312667 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64\": container with ID starting with e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64 not found: ID does not exist" containerID="e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.312701 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64"} err="failed to get container status \"e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64\": rpc error: code = NotFound desc = could not find container \"e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64\": container with ID starting with e4cf70a2adefd45610d04c7479aa013a760c611157615f81e149ca9fa64b6e64 not found: ID does not exist" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.340098 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-config-data\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.340183 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-shwj7\" (UniqueName: \"kubernetes.io/projected/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-kube-api-access-shwj7\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.340290 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-scripts\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.340356 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.340420 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-log-httpd\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.340472 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-run-httpd\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.340635 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-combined-ca-bundle\") 
pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.357233 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88802aae-2f11-483c-abaa-c768177f4f1f" path="/var/lib/kubelet/pods/88802aae-2f11-483c-abaa-c768177f4f1f/volumes" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.441909 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-scripts\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.441978 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.442016 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-log-httpd\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.442047 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-run-httpd\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.442152 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.442198 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-config-data\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.442230 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shwj7\" (UniqueName: \"kubernetes.io/projected/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-kube-api-access-shwj7\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.442733 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-run-httpd\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.444226 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-log-httpd\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.447442 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-config-data\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.447739 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.453472 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.456063 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-scripts\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.476525 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shwj7\" (UniqueName: \"kubernetes.io/projected/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-kube-api-access-shwj7\") pod \"ceilometer-0\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.629533 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.692817 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.746999 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4tmq\" (UniqueName: \"kubernetes.io/projected/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-kube-api-access-d4tmq\") pod \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\" (UID: \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\") " Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.747059 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-catalog-content\") pod \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\" (UID: \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\") " Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.747246 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-utilities\") pod \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\" (UID: \"fcdacf1b-ba86-43bc-94af-a78f1ac7146d\") " Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.748332 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-utilities" (OuterVolumeSpecName: "utilities") pod "fcdacf1b-ba86-43bc-94af-a78f1ac7146d" (UID: "fcdacf1b-ba86-43bc-94af-a78f1ac7146d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.755304 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-kube-api-access-d4tmq" (OuterVolumeSpecName: "kube-api-access-d4tmq") pod "fcdacf1b-ba86-43bc-94af-a78f1ac7146d" (UID: "fcdacf1b-ba86-43bc-94af-a78f1ac7146d"). InnerVolumeSpecName "kube-api-access-d4tmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.831095 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcdacf1b-ba86-43bc-94af-a78f1ac7146d" (UID: "fcdacf1b-ba86-43bc-94af-a78f1ac7146d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.849465 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.849497 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:08 crc kubenswrapper[4956]: I0930 05:47:08.849507 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4tmq\" (UniqueName: \"kubernetes.io/projected/fcdacf1b-ba86-43bc-94af-a78f1ac7146d-kube-api-access-d4tmq\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.148407 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.197922 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970","Type":"ContainerStarted","Data":"5f88b94c0a02442f13bbea275fc8fead9dc7f68e068559342c3a47369e2d8321"} Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.201457 4956 generic.go:334] "Generic (PLEG): container finished" podID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" 
containerID="ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92" exitCode=0 Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.201533 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h5ch9" Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.201594 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5ch9" event={"ID":"fcdacf1b-ba86-43bc-94af-a78f1ac7146d","Type":"ContainerDied","Data":"ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92"} Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.201628 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5ch9" event={"ID":"fcdacf1b-ba86-43bc-94af-a78f1ac7146d","Type":"ContainerDied","Data":"5c62f8e49df70caf7cc911f48d410543f2f0fdfe91265e674faaee1cc5fe8acb"} Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.201644 4956 scope.go:117] "RemoveContainer" containerID="ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92" Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.231400 4956 scope.go:117] "RemoveContainer" containerID="29da40dab5f6db4d465a7cc865e3b8867ee1ed79f208d280f6bc129733cd56d0" Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.249161 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h5ch9"] Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.261504 4956 scope.go:117] "RemoveContainer" containerID="0501678af21f6fe0a745c8cccfd8186ed6eef4fb6f3658213697b20b5f772d60" Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.264985 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h5ch9"] Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.312418 4956 scope.go:117] "RemoveContainer" containerID="ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92" Sep 30 05:47:09 crc 
kubenswrapper[4956]: E0930 05:47:09.323774 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92\": container with ID starting with ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92 not found: ID does not exist" containerID="ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92" Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.323945 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92"} err="failed to get container status \"ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92\": rpc error: code = NotFound desc = could not find container \"ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92\": container with ID starting with ac2cf1e0dffd1ac5889aaab0c1c9d26fdbcd1133ffdd8975d84ee14dd4f2dd92 not found: ID does not exist" Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.324054 4956 scope.go:117] "RemoveContainer" containerID="29da40dab5f6db4d465a7cc865e3b8867ee1ed79f208d280f6bc129733cd56d0" Sep 30 05:47:09 crc kubenswrapper[4956]: E0930 05:47:09.325291 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29da40dab5f6db4d465a7cc865e3b8867ee1ed79f208d280f6bc129733cd56d0\": container with ID starting with 29da40dab5f6db4d465a7cc865e3b8867ee1ed79f208d280f6bc129733cd56d0 not found: ID does not exist" containerID="29da40dab5f6db4d465a7cc865e3b8867ee1ed79f208d280f6bc129733cd56d0" Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.325331 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29da40dab5f6db4d465a7cc865e3b8867ee1ed79f208d280f6bc129733cd56d0"} err="failed to get container status 
\"29da40dab5f6db4d465a7cc865e3b8867ee1ed79f208d280f6bc129733cd56d0\": rpc error: code = NotFound desc = could not find container \"29da40dab5f6db4d465a7cc865e3b8867ee1ed79f208d280f6bc129733cd56d0\": container with ID starting with 29da40dab5f6db4d465a7cc865e3b8867ee1ed79f208d280f6bc129733cd56d0 not found: ID does not exist" Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.325356 4956 scope.go:117] "RemoveContainer" containerID="0501678af21f6fe0a745c8cccfd8186ed6eef4fb6f3658213697b20b5f772d60" Sep 30 05:47:09 crc kubenswrapper[4956]: E0930 05:47:09.325824 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0501678af21f6fe0a745c8cccfd8186ed6eef4fb6f3658213697b20b5f772d60\": container with ID starting with 0501678af21f6fe0a745c8cccfd8186ed6eef4fb6f3658213697b20b5f772d60 not found: ID does not exist" containerID="0501678af21f6fe0a745c8cccfd8186ed6eef4fb6f3658213697b20b5f772d60" Sep 30 05:47:09 crc kubenswrapper[4956]: I0930 05:47:09.325846 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0501678af21f6fe0a745c8cccfd8186ed6eef4fb6f3658213697b20b5f772d60"} err="failed to get container status \"0501678af21f6fe0a745c8cccfd8186ed6eef4fb6f3658213697b20b5f772d60\": rpc error: code = NotFound desc = could not find container \"0501678af21f6fe0a745c8cccfd8186ed6eef4fb6f3658213697b20b5f772d60\": container with ID starting with 0501678af21f6fe0a745c8cccfd8186ed6eef4fb6f3658213697b20b5f772d60 not found: ID does not exist" Sep 30 05:47:10 crc kubenswrapper[4956]: I0930 05:47:10.214485 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970","Type":"ContainerStarted","Data":"920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1"} Sep 30 05:47:10 crc kubenswrapper[4956]: I0930 05:47:10.216131 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970","Type":"ContainerStarted","Data":"8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef"} Sep 30 05:47:10 crc kubenswrapper[4956]: I0930 05:47:10.292776 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 05:47:10 crc kubenswrapper[4956]: I0930 05:47:10.292839 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 05:47:10 crc kubenswrapper[4956]: I0930 05:47:10.335730 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 05:47:10 crc kubenswrapper[4956]: I0930 05:47:10.375637 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" path="/var/lib/kubelet/pods/fcdacf1b-ba86-43bc-94af-a78f1ac7146d/volumes" Sep 30 05:47:10 crc kubenswrapper[4956]: I0930 05:47:10.377487 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 05:47:10 crc kubenswrapper[4956]: I0930 05:47:10.417705 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 05:47:11 crc kubenswrapper[4956]: I0930 05:47:11.237616 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970","Type":"ContainerStarted","Data":"63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e"} Sep 30 05:47:11 crc kubenswrapper[4956]: I0930 05:47:11.237939 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 05:47:11 crc kubenswrapper[4956]: I0930 05:47:11.237975 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 
30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.249253 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970","Type":"ContainerStarted","Data":"b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e"} Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.249574 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.272699 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7483338229999998 podStartE2EDuration="4.272681207s" podCreationTimestamp="2025-09-30 05:47:08 +0000 UTC" firstStartedPulling="2025-09-30 05:47:09.155318099 +0000 UTC m=+1099.482438634" lastFinishedPulling="2025-09-30 05:47:11.679665453 +0000 UTC m=+1102.006786018" observedRunningTime="2025-09-30 05:47:12.272149501 +0000 UTC m=+1102.599270036" watchObservedRunningTime="2025-09-30 05:47:12.272681207 +0000 UTC m=+1102.599801732" Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.666190 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-z68cs"] Sep 30 05:47:12 crc kubenswrapper[4956]: E0930 05:47:12.666698 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" containerName="registry-server" Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.666717 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" containerName="registry-server" Sep 30 05:47:12 crc kubenswrapper[4956]: E0930 05:47:12.666753 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" containerName="extract-utilities" Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.666761 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" 
containerName="extract-utilities" Sep 30 05:47:12 crc kubenswrapper[4956]: E0930 05:47:12.666774 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" containerName="extract-content" Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.666780 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" containerName="extract-content" Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.667010 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcdacf1b-ba86-43bc-94af-a78f1ac7146d" containerName="registry-server" Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.667753 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-z68cs" Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.684395 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-z68cs"] Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.741852 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6xmh\" (UniqueName: \"kubernetes.io/projected/a62ad408-87f6-49ec-a4ff-690217492edd-kube-api-access-h6xmh\") pod \"nova-api-db-create-z68cs\" (UID: \"a62ad408-87f6-49ec-a4ff-690217492edd\") " pod="openstack/nova-api-db-create-z68cs" Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.844068 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6xmh\" (UniqueName: \"kubernetes.io/projected/a62ad408-87f6-49ec-a4ff-690217492edd-kube-api-access-h6xmh\") pod \"nova-api-db-create-z68cs\" (UID: \"a62ad408-87f6-49ec-a4ff-690217492edd\") " pod="openstack/nova-api-db-create-z68cs" Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.858712 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-whg9s"] Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 
05:47:12.860212 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-whg9s" Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.868860 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6xmh\" (UniqueName: \"kubernetes.io/projected/a62ad408-87f6-49ec-a4ff-690217492edd-kube-api-access-h6xmh\") pod \"nova-api-db-create-z68cs\" (UID: \"a62ad408-87f6-49ec-a4ff-690217492edd\") " pod="openstack/nova-api-db-create-z68cs" Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.907272 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-whg9s"] Sep 30 05:47:12 crc kubenswrapper[4956]: I0930 05:47:12.961679 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2h8m\" (UniqueName: \"kubernetes.io/projected/4d8c8135-c4bc-40cf-ad92-95995fba0cd0-kube-api-access-r2h8m\") pod \"nova-cell0-db-create-whg9s\" (UID: \"4d8c8135-c4bc-40cf-ad92-95995fba0cd0\") " pod="openstack/nova-cell0-db-create-whg9s" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.003857 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-z68cs" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.008165 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-kkgnj"] Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.009992 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kkgnj" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.033594 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kkgnj"] Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.071289 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk4x7\" (UniqueName: \"kubernetes.io/projected/54dd590b-ab6c-4760-83f9-da40bc524570-kube-api-access-bk4x7\") pod \"nova-cell1-db-create-kkgnj\" (UID: \"54dd590b-ab6c-4760-83f9-da40bc524570\") " pod="openstack/nova-cell1-db-create-kkgnj" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.071525 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2h8m\" (UniqueName: \"kubernetes.io/projected/4d8c8135-c4bc-40cf-ad92-95995fba0cd0-kube-api-access-r2h8m\") pod \"nova-cell0-db-create-whg9s\" (UID: \"4d8c8135-c4bc-40cf-ad92-95995fba0cd0\") " pod="openstack/nova-cell0-db-create-whg9s" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.121768 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2h8m\" (UniqueName: \"kubernetes.io/projected/4d8c8135-c4bc-40cf-ad92-95995fba0cd0-kube-api-access-r2h8m\") pod \"nova-cell0-db-create-whg9s\" (UID: \"4d8c8135-c4bc-40cf-ad92-95995fba0cd0\") " pod="openstack/nova-cell0-db-create-whg9s" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.173433 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk4x7\" (UniqueName: \"kubernetes.io/projected/54dd590b-ab6c-4760-83f9-da40bc524570-kube-api-access-bk4x7\") pod \"nova-cell1-db-create-kkgnj\" (UID: \"54dd590b-ab6c-4760-83f9-da40bc524570\") " pod="openstack/nova-cell1-db-create-kkgnj" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.229907 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk4x7\" 
(UniqueName: \"kubernetes.io/projected/54dd590b-ab6c-4760-83f9-da40bc524570-kube-api-access-bk4x7\") pod \"nova-cell1-db-create-kkgnj\" (UID: \"54dd590b-ab6c-4760-83f9-da40bc524570\") " pod="openstack/nova-cell1-db-create-kkgnj" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.244744 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-whg9s" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.250402 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kkgnj" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.748080 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-z68cs"] Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.828268 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.828378 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.832927 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 05:47:13 crc kubenswrapper[4956]: I0930 05:47:13.980212 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kkgnj"] Sep 30 05:47:14 crc kubenswrapper[4956]: I0930 05:47:14.101448 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-whg9s"] Sep 30 05:47:14 crc kubenswrapper[4956]: I0930 05:47:14.279060 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kkgnj" event={"ID":"54dd590b-ab6c-4760-83f9-da40bc524570","Type":"ContainerStarted","Data":"a7db0650db19acc370ff420cc6ae3b676ff27ecf2dcf55abedd71d8e9fdf3b13"} Sep 30 05:47:14 crc kubenswrapper[4956]: I0930 05:47:14.280342 4956 generic.go:334] "Generic (PLEG): 
container finished" podID="a62ad408-87f6-49ec-a4ff-690217492edd" containerID="5ff7491d10e34d87f1d8b7f2e0977c335b771a49054273a019d0ee97d111d953" exitCode=0 Sep 30 05:47:14 crc kubenswrapper[4956]: I0930 05:47:14.281009 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-z68cs" event={"ID":"a62ad408-87f6-49ec-a4ff-690217492edd","Type":"ContainerDied","Data":"5ff7491d10e34d87f1d8b7f2e0977c335b771a49054273a019d0ee97d111d953"} Sep 30 05:47:14 crc kubenswrapper[4956]: I0930 05:47:14.281053 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-z68cs" event={"ID":"a62ad408-87f6-49ec-a4ff-690217492edd","Type":"ContainerStarted","Data":"af20159c658d02f09ac5026945891ed1c4e321a7248741f0606df07480150a50"} Sep 30 05:47:14 crc kubenswrapper[4956]: I0930 05:47:14.282608 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-whg9s" event={"ID":"4d8c8135-c4bc-40cf-ad92-95995fba0cd0","Type":"ContainerStarted","Data":"527f5bd225238cc694d18bb7be06c382f7fd8879b4607a751ccb61c5b266c34c"} Sep 30 05:47:14 crc kubenswrapper[4956]: I0930 05:47:14.852045 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:14 crc kubenswrapper[4956]: I0930 05:47:14.852087 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:14 crc kubenswrapper[4956]: I0930 05:47:14.887061 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:14 crc kubenswrapper[4956]: I0930 05:47:14.897426 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:15 crc kubenswrapper[4956]: I0930 05:47:15.294008 4956 generic.go:334] "Generic (PLEG): container finished" podID="4d8c8135-c4bc-40cf-ad92-95995fba0cd0" 
containerID="e6cc2a14e90000a60308ba8952958484dc0ca122a291d3702de25f99317c76c8" exitCode=0 Sep 30 05:47:15 crc kubenswrapper[4956]: I0930 05:47:15.294057 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-whg9s" event={"ID":"4d8c8135-c4bc-40cf-ad92-95995fba0cd0","Type":"ContainerDied","Data":"e6cc2a14e90000a60308ba8952958484dc0ca122a291d3702de25f99317c76c8"} Sep 30 05:47:15 crc kubenswrapper[4956]: I0930 05:47:15.297289 4956 generic.go:334] "Generic (PLEG): container finished" podID="54dd590b-ab6c-4760-83f9-da40bc524570" containerID="fd19bf5931c1a9956c3ed67328d1a107bf12d01ddfa8d24940a96d44f2deb2b2" exitCode=0 Sep 30 05:47:15 crc kubenswrapper[4956]: I0930 05:47:15.297351 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kkgnj" event={"ID":"54dd590b-ab6c-4760-83f9-da40bc524570","Type":"ContainerDied","Data":"fd19bf5931c1a9956c3ed67328d1a107bf12d01ddfa8d24940a96d44f2deb2b2"} Sep 30 05:47:15 crc kubenswrapper[4956]: I0930 05:47:15.300237 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:15 crc kubenswrapper[4956]: I0930 05:47:15.300342 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:15 crc kubenswrapper[4956]: I0930 05:47:15.611505 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 05:47:15 crc kubenswrapper[4956]: I0930 05:47:15.770435 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-z68cs" Sep 30 05:47:15 crc kubenswrapper[4956]: I0930 05:47:15.952234 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6xmh\" (UniqueName: \"kubernetes.io/projected/a62ad408-87f6-49ec-a4ff-690217492edd-kube-api-access-h6xmh\") pod \"a62ad408-87f6-49ec-a4ff-690217492edd\" (UID: \"a62ad408-87f6-49ec-a4ff-690217492edd\") " Sep 30 05:47:15 crc kubenswrapper[4956]: I0930 05:47:15.963372 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a62ad408-87f6-49ec-a4ff-690217492edd-kube-api-access-h6xmh" (OuterVolumeSpecName: "kube-api-access-h6xmh") pod "a62ad408-87f6-49ec-a4ff-690217492edd" (UID: "a62ad408-87f6-49ec-a4ff-690217492edd"). InnerVolumeSpecName "kube-api-access-h6xmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:47:16 crc kubenswrapper[4956]: I0930 05:47:16.054867 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6xmh\" (UniqueName: \"kubernetes.io/projected/a62ad408-87f6-49ec-a4ff-690217492edd-kube-api-access-h6xmh\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:16 crc kubenswrapper[4956]: I0930 05:47:16.313284 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-z68cs" event={"ID":"a62ad408-87f6-49ec-a4ff-690217492edd","Type":"ContainerDied","Data":"af20159c658d02f09ac5026945891ed1c4e321a7248741f0606df07480150a50"} Sep 30 05:47:16 crc kubenswrapper[4956]: I0930 05:47:16.313330 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-z68cs" Sep 30 05:47:16 crc kubenswrapper[4956]: I0930 05:47:16.313343 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af20159c658d02f09ac5026945891ed1c4e321a7248741f0606df07480150a50" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.003926 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-whg9s" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.012239 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kkgnj" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.096131 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2h8m\" (UniqueName: \"kubernetes.io/projected/4d8c8135-c4bc-40cf-ad92-95995fba0cd0-kube-api-access-r2h8m\") pod \"4d8c8135-c4bc-40cf-ad92-95995fba0cd0\" (UID: \"4d8c8135-c4bc-40cf-ad92-95995fba0cd0\") " Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.096583 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk4x7\" (UniqueName: \"kubernetes.io/projected/54dd590b-ab6c-4760-83f9-da40bc524570-kube-api-access-bk4x7\") pod \"54dd590b-ab6c-4760-83f9-da40bc524570\" (UID: \"54dd590b-ab6c-4760-83f9-da40bc524570\") " Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.105000 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8c8135-c4bc-40cf-ad92-95995fba0cd0-kube-api-access-r2h8m" (OuterVolumeSpecName: "kube-api-access-r2h8m") pod "4d8c8135-c4bc-40cf-ad92-95995fba0cd0" (UID: "4d8c8135-c4bc-40cf-ad92-95995fba0cd0"). InnerVolumeSpecName "kube-api-access-r2h8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.105262 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54dd590b-ab6c-4760-83f9-da40bc524570-kube-api-access-bk4x7" (OuterVolumeSpecName: "kube-api-access-bk4x7") pod "54dd590b-ab6c-4760-83f9-da40bc524570" (UID: "54dd590b-ab6c-4760-83f9-da40bc524570"). InnerVolumeSpecName "kube-api-access-bk4x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.198910 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk4x7\" (UniqueName: \"kubernetes.io/projected/54dd590b-ab6c-4760-83f9-da40bc524570-kube-api-access-bk4x7\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.198943 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2h8m\" (UniqueName: \"kubernetes.io/projected/4d8c8135-c4bc-40cf-ad92-95995fba0cd0-kube-api-access-r2h8m\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.323691 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kkgnj" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.323695 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kkgnj" event={"ID":"54dd590b-ab6c-4760-83f9-da40bc524570","Type":"ContainerDied","Data":"a7db0650db19acc370ff420cc6ae3b676ff27ecf2dcf55abedd71d8e9fdf3b13"} Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.324159 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7db0650db19acc370ff420cc6ae3b676ff27ecf2dcf55abedd71d8e9fdf3b13" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.325535 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.325555 4956 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.325567 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-whg9s" event={"ID":"4d8c8135-c4bc-40cf-ad92-95995fba0cd0","Type":"ContainerDied","Data":"527f5bd225238cc694d18bb7be06c382f7fd8879b4607a751ccb61c5b266c34c"} Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.325620 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="527f5bd225238cc694d18bb7be06c382f7fd8879b4607a751ccb61c5b266c34c" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.325722 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-whg9s" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.727664 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:17 crc kubenswrapper[4956]: I0930 05:47:17.727985 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 05:47:18 crc kubenswrapper[4956]: I0930 05:47:18.073171 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:47:18 crc kubenswrapper[4956]: I0930 05:47:18.073235 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:47:18 crc kubenswrapper[4956]: I0930 05:47:18.073295 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:47:18 crc kubenswrapper[4956]: I0930 05:47:18.074107 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f437f59e0bbde583a02d6871ab0b4a73cd7224d37812609e0ec49e3d0eab2998"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 05:47:18 crc kubenswrapper[4956]: I0930 05:47:18.074174 4956 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://f437f59e0bbde583a02d6871ab0b4a73cd7224d37812609e0ec49e3d0eab2998" gracePeriod=600 Sep 30 05:47:18 crc kubenswrapper[4956]: E0930 05:47:18.140218 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ecd015b_e216_40d8_ae78_711b2a65c193.slice/crio-conmon-f437f59e0bbde583a02d6871ab0b4a73cd7224d37812609e0ec49e3d0eab2998.scope\": RecentStats: unable to find data in memory cache]" Sep 30 05:47:18 crc kubenswrapper[4956]: I0930 05:47:18.336520 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="f437f59e0bbde583a02d6871ab0b4a73cd7224d37812609e0ec49e3d0eab2998" exitCode=0 Sep 30 05:47:18 crc kubenswrapper[4956]: I0930 05:47:18.336595 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"f437f59e0bbde583a02d6871ab0b4a73cd7224d37812609e0ec49e3d0eab2998"} Sep 30 05:47:18 crc kubenswrapper[4956]: I0930 05:47:18.336861 4956 scope.go:117] "RemoveContainer" containerID="357cd0f449d0c4885ee791783ca31e326eaaa7e5f1a04d708b2435c26c26f499" Sep 30 05:47:19 crc kubenswrapper[4956]: I0930 05:47:19.348537 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"d6707b9d26600f66963acd8d20bb882ca13db20f72f914fce4d118ce50f76933"} Sep 30 05:47:22 crc kubenswrapper[4956]: I0930 05:47:22.895032 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e83b-account-create-ljb8m"] Sep 30 05:47:22 crc kubenswrapper[4956]: E0930 
05:47:22.896185 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54dd590b-ab6c-4760-83f9-da40bc524570" containerName="mariadb-database-create" Sep 30 05:47:22 crc kubenswrapper[4956]: I0930 05:47:22.896201 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="54dd590b-ab6c-4760-83f9-da40bc524570" containerName="mariadb-database-create" Sep 30 05:47:22 crc kubenswrapper[4956]: E0930 05:47:22.896214 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8c8135-c4bc-40cf-ad92-95995fba0cd0" containerName="mariadb-database-create" Sep 30 05:47:22 crc kubenswrapper[4956]: I0930 05:47:22.896221 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8c8135-c4bc-40cf-ad92-95995fba0cd0" containerName="mariadb-database-create" Sep 30 05:47:22 crc kubenswrapper[4956]: E0930 05:47:22.896268 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a62ad408-87f6-49ec-a4ff-690217492edd" containerName="mariadb-database-create" Sep 30 05:47:22 crc kubenswrapper[4956]: I0930 05:47:22.896277 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62ad408-87f6-49ec-a4ff-690217492edd" containerName="mariadb-database-create" Sep 30 05:47:22 crc kubenswrapper[4956]: I0930 05:47:22.896529 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d8c8135-c4bc-40cf-ad92-95995fba0cd0" containerName="mariadb-database-create" Sep 30 05:47:22 crc kubenswrapper[4956]: I0930 05:47:22.896554 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="54dd590b-ab6c-4760-83f9-da40bc524570" containerName="mariadb-database-create" Sep 30 05:47:22 crc kubenswrapper[4956]: I0930 05:47:22.896566 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a62ad408-87f6-49ec-a4ff-690217492edd" containerName="mariadb-database-create" Sep 30 05:47:22 crc kubenswrapper[4956]: I0930 05:47:22.897378 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e83b-account-create-ljb8m" Sep 30 05:47:22 crc kubenswrapper[4956]: I0930 05:47:22.901074 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 30 05:47:22 crc kubenswrapper[4956]: I0930 05:47:22.908818 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e83b-account-create-ljb8m"] Sep 30 05:47:22 crc kubenswrapper[4956]: I0930 05:47:22.941745 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fqxz\" (UniqueName: \"kubernetes.io/projected/23b6e1d0-5365-4ff6-8d62-63e63a676194-kube-api-access-8fqxz\") pod \"nova-api-e83b-account-create-ljb8m\" (UID: \"23b6e1d0-5365-4ff6-8d62-63e63a676194\") " pod="openstack/nova-api-e83b-account-create-ljb8m" Sep 30 05:47:23 crc kubenswrapper[4956]: I0930 05:47:23.043846 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fqxz\" (UniqueName: \"kubernetes.io/projected/23b6e1d0-5365-4ff6-8d62-63e63a676194-kube-api-access-8fqxz\") pod \"nova-api-e83b-account-create-ljb8m\" (UID: \"23b6e1d0-5365-4ff6-8d62-63e63a676194\") " pod="openstack/nova-api-e83b-account-create-ljb8m" Sep 30 05:47:23 crc kubenswrapper[4956]: I0930 05:47:23.073255 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fqxz\" (UniqueName: \"kubernetes.io/projected/23b6e1d0-5365-4ff6-8d62-63e63a676194-kube-api-access-8fqxz\") pod \"nova-api-e83b-account-create-ljb8m\" (UID: \"23b6e1d0-5365-4ff6-8d62-63e63a676194\") " pod="openstack/nova-api-e83b-account-create-ljb8m" Sep 30 05:47:23 crc kubenswrapper[4956]: I0930 05:47:23.218171 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e83b-account-create-ljb8m" Sep 30 05:47:23 crc kubenswrapper[4956]: I0930 05:47:23.801379 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e83b-account-create-ljb8m"] Sep 30 05:47:23 crc kubenswrapper[4956]: W0930 05:47:23.809190 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23b6e1d0_5365_4ff6_8d62_63e63a676194.slice/crio-82ed652262ba02d2436c84f6f91c6979c5223b9c6f1e3828fc1bf4cdf67686fa WatchSource:0}: Error finding container 82ed652262ba02d2436c84f6f91c6979c5223b9c6f1e3828fc1bf4cdf67686fa: Status 404 returned error can't find the container with id 82ed652262ba02d2436c84f6f91c6979c5223b9c6f1e3828fc1bf4cdf67686fa Sep 30 05:47:24 crc kubenswrapper[4956]: I0930 05:47:24.400349 4956 generic.go:334] "Generic (PLEG): container finished" podID="23b6e1d0-5365-4ff6-8d62-63e63a676194" containerID="d6fc9d8af7e549440a093bfdd6758f6a47a88026851e093bc423bffed4754901" exitCode=0 Sep 30 05:47:24 crc kubenswrapper[4956]: I0930 05:47:24.400458 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e83b-account-create-ljb8m" event={"ID":"23b6e1d0-5365-4ff6-8d62-63e63a676194","Type":"ContainerDied","Data":"d6fc9d8af7e549440a093bfdd6758f6a47a88026851e093bc423bffed4754901"} Sep 30 05:47:24 crc kubenswrapper[4956]: I0930 05:47:24.401323 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e83b-account-create-ljb8m" event={"ID":"23b6e1d0-5365-4ff6-8d62-63e63a676194","Type":"ContainerStarted","Data":"82ed652262ba02d2436c84f6f91c6979c5223b9c6f1e3828fc1bf4cdf67686fa"} Sep 30 05:47:25 crc kubenswrapper[4956]: I0930 05:47:25.862043 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e83b-account-create-ljb8m" Sep 30 05:47:25 crc kubenswrapper[4956]: I0930 05:47:25.896820 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fqxz\" (UniqueName: \"kubernetes.io/projected/23b6e1d0-5365-4ff6-8d62-63e63a676194-kube-api-access-8fqxz\") pod \"23b6e1d0-5365-4ff6-8d62-63e63a676194\" (UID: \"23b6e1d0-5365-4ff6-8d62-63e63a676194\") " Sep 30 05:47:25 crc kubenswrapper[4956]: I0930 05:47:25.903636 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b6e1d0-5365-4ff6-8d62-63e63a676194-kube-api-access-8fqxz" (OuterVolumeSpecName: "kube-api-access-8fqxz") pod "23b6e1d0-5365-4ff6-8d62-63e63a676194" (UID: "23b6e1d0-5365-4ff6-8d62-63e63a676194"). InnerVolumeSpecName "kube-api-access-8fqxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:47:26 crc kubenswrapper[4956]: I0930 05:47:26.000107 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fqxz\" (UniqueName: \"kubernetes.io/projected/23b6e1d0-5365-4ff6-8d62-63e63a676194-kube-api-access-8fqxz\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:26 crc kubenswrapper[4956]: I0930 05:47:26.430614 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e83b-account-create-ljb8m" event={"ID":"23b6e1d0-5365-4ff6-8d62-63e63a676194","Type":"ContainerDied","Data":"82ed652262ba02d2436c84f6f91c6979c5223b9c6f1e3828fc1bf4cdf67686fa"} Sep 30 05:47:26 crc kubenswrapper[4956]: I0930 05:47:26.430678 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ed652262ba02d2436c84f6f91c6979c5223b9c6f1e3828fc1bf4cdf67686fa" Sep 30 05:47:26 crc kubenswrapper[4956]: I0930 05:47:26.430767 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e83b-account-create-ljb8m" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.030254 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-858b-account-create-bjjkx"] Sep 30 05:47:33 crc kubenswrapper[4956]: E0930 05:47:33.031256 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b6e1d0-5365-4ff6-8d62-63e63a676194" containerName="mariadb-account-create" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.031274 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b6e1d0-5365-4ff6-8d62-63e63a676194" containerName="mariadb-account-create" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.031489 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b6e1d0-5365-4ff6-8d62-63e63a676194" containerName="mariadb-account-create" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.032159 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-858b-account-create-bjjkx" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.042108 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-858b-account-create-bjjkx"] Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.082348 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.149689 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7kw8\" (UniqueName: \"kubernetes.io/projected/f3d5954c-8871-4eb9-aa3d-4ec131b12791-kube-api-access-m7kw8\") pod \"nova-cell0-858b-account-create-bjjkx\" (UID: \"f3d5954c-8871-4eb9-aa3d-4ec131b12791\") " pod="openstack/nova-cell0-858b-account-create-bjjkx" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.164985 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:47:33 
crc kubenswrapper[4956]: I0930 05:47:33.165264 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="ceilometer-central-agent" containerID="cri-o://8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef" gracePeriod=30 Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.165422 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="proxy-httpd" containerID="cri-o://b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e" gracePeriod=30 Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.165464 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="sg-core" containerID="cri-o://63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e" gracePeriod=30 Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.165495 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="ceilometer-notification-agent" containerID="cri-o://920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1" gracePeriod=30 Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.178846 4956 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.191:3000/\": EOF" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.231152 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f1e7-account-create-nx5br"] Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.232840 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f1e7-account-create-nx5br" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.235191 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.242241 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f1e7-account-create-nx5br"] Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.250978 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7kw8\" (UniqueName: \"kubernetes.io/projected/f3d5954c-8871-4eb9-aa3d-4ec131b12791-kube-api-access-m7kw8\") pod \"nova-cell0-858b-account-create-bjjkx\" (UID: \"f3d5954c-8871-4eb9-aa3d-4ec131b12791\") " pod="openstack/nova-cell0-858b-account-create-bjjkx" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.270137 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7kw8\" (UniqueName: \"kubernetes.io/projected/f3d5954c-8871-4eb9-aa3d-4ec131b12791-kube-api-access-m7kw8\") pod \"nova-cell0-858b-account-create-bjjkx\" (UID: \"f3d5954c-8871-4eb9-aa3d-4ec131b12791\") " pod="openstack/nova-cell0-858b-account-create-bjjkx" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.352772 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qssvp\" (UniqueName: \"kubernetes.io/projected/833452de-c7d3-4b24-939d-13dccbbddadb-kube-api-access-qssvp\") pod \"nova-cell1-f1e7-account-create-nx5br\" (UID: \"833452de-c7d3-4b24-939d-13dccbbddadb\") " pod="openstack/nova-cell1-f1e7-account-create-nx5br" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.398669 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-858b-account-create-bjjkx" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.454502 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qssvp\" (UniqueName: \"kubernetes.io/projected/833452de-c7d3-4b24-939d-13dccbbddadb-kube-api-access-qssvp\") pod \"nova-cell1-f1e7-account-create-nx5br\" (UID: \"833452de-c7d3-4b24-939d-13dccbbddadb\") " pod="openstack/nova-cell1-f1e7-account-create-nx5br" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.480730 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qssvp\" (UniqueName: \"kubernetes.io/projected/833452de-c7d3-4b24-939d-13dccbbddadb-kube-api-access-qssvp\") pod \"nova-cell1-f1e7-account-create-nx5br\" (UID: \"833452de-c7d3-4b24-939d-13dccbbddadb\") " pod="openstack/nova-cell1-f1e7-account-create-nx5br" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.509271 4956 generic.go:334] "Generic (PLEG): container finished" podID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerID="b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e" exitCode=0 Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.509318 4956 generic.go:334] "Generic (PLEG): container finished" podID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerID="63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e" exitCode=2 Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.509342 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970","Type":"ContainerDied","Data":"b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e"} Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.509372 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970","Type":"ContainerDied","Data":"63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e"} Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.550239 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f1e7-account-create-nx5br" Sep 30 05:47:33 crc kubenswrapper[4956]: I0930 05:47:33.866970 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-858b-account-create-bjjkx"] Sep 30 05:47:33 crc kubenswrapper[4956]: W0930 05:47:33.875382 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3d5954c_8871_4eb9_aa3d_4ec131b12791.slice/crio-defe480264772ff4905075cf1e6b7743b5e9aac25a3e96ce526ed70be6fbc029 WatchSource:0}: Error finding container defe480264772ff4905075cf1e6b7743b5e9aac25a3e96ce526ed70be6fbc029: Status 404 returned error can't find the container with id defe480264772ff4905075cf1e6b7743b5e9aac25a3e96ce526ed70be6fbc029 Sep 30 05:47:34 crc kubenswrapper[4956]: I0930 05:47:34.014910 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f1e7-account-create-nx5br"] Sep 30 05:47:34 crc kubenswrapper[4956]: W0930 05:47:34.018142 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod833452de_c7d3_4b24_939d_13dccbbddadb.slice/crio-c64b9b90ed3727a46458acd83c91d5fd46e71c86e60367d81bb57597af046658 WatchSource:0}: Error finding container c64b9b90ed3727a46458acd83c91d5fd46e71c86e60367d81bb57597af046658: Status 404 returned error can't find the container with id c64b9b90ed3727a46458acd83c91d5fd46e71c86e60367d81bb57597af046658 Sep 30 05:47:34 crc kubenswrapper[4956]: I0930 05:47:34.520773 4956 generic.go:334] "Generic (PLEG): container finished" podID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" 
containerID="8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef" exitCode=0 Sep 30 05:47:34 crc kubenswrapper[4956]: I0930 05:47:34.520854 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970","Type":"ContainerDied","Data":"8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef"} Sep 30 05:47:34 crc kubenswrapper[4956]: I0930 05:47:34.522500 4956 generic.go:334] "Generic (PLEG): container finished" podID="f3d5954c-8871-4eb9-aa3d-4ec131b12791" containerID="ad980e1cc3af3d725cb7ebc8008d86cfe4f4ce1e9d706d0b5e57b0be5200b678" exitCode=0 Sep 30 05:47:34 crc kubenswrapper[4956]: I0930 05:47:34.522528 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-858b-account-create-bjjkx" event={"ID":"f3d5954c-8871-4eb9-aa3d-4ec131b12791","Type":"ContainerDied","Data":"ad980e1cc3af3d725cb7ebc8008d86cfe4f4ce1e9d706d0b5e57b0be5200b678"} Sep 30 05:47:34 crc kubenswrapper[4956]: I0930 05:47:34.522600 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-858b-account-create-bjjkx" event={"ID":"f3d5954c-8871-4eb9-aa3d-4ec131b12791","Type":"ContainerStarted","Data":"defe480264772ff4905075cf1e6b7743b5e9aac25a3e96ce526ed70be6fbc029"} Sep 30 05:47:34 crc kubenswrapper[4956]: I0930 05:47:34.524279 4956 generic.go:334] "Generic (PLEG): container finished" podID="833452de-c7d3-4b24-939d-13dccbbddadb" containerID="93663a57af4168398e0cd80b1148fa6421e0206ef579b26802d61d49f44138e9" exitCode=0 Sep 30 05:47:34 crc kubenswrapper[4956]: I0930 05:47:34.524318 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f1e7-account-create-nx5br" event={"ID":"833452de-c7d3-4b24-939d-13dccbbddadb","Type":"ContainerDied","Data":"93663a57af4168398e0cd80b1148fa6421e0206ef579b26802d61d49f44138e9"} Sep 30 05:47:34 crc kubenswrapper[4956]: I0930 05:47:34.524350 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-f1e7-account-create-nx5br" event={"ID":"833452de-c7d3-4b24-939d-13dccbbddadb","Type":"ContainerStarted","Data":"c64b9b90ed3727a46458acd83c91d5fd46e71c86e60367d81bb57597af046658"} Sep 30 05:47:35 crc kubenswrapper[4956]: I0930 05:47:35.970716 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f1e7-account-create-nx5br" Sep 30 05:47:35 crc kubenswrapper[4956]: I0930 05:47:35.976518 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-858b-account-create-bjjkx" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.011052 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7kw8\" (UniqueName: \"kubernetes.io/projected/f3d5954c-8871-4eb9-aa3d-4ec131b12791-kube-api-access-m7kw8\") pod \"f3d5954c-8871-4eb9-aa3d-4ec131b12791\" (UID: \"f3d5954c-8871-4eb9-aa3d-4ec131b12791\") " Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.011250 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qssvp\" (UniqueName: \"kubernetes.io/projected/833452de-c7d3-4b24-939d-13dccbbddadb-kube-api-access-qssvp\") pod \"833452de-c7d3-4b24-939d-13dccbbddadb\" (UID: \"833452de-c7d3-4b24-939d-13dccbbddadb\") " Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.018209 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d5954c-8871-4eb9-aa3d-4ec131b12791-kube-api-access-m7kw8" (OuterVolumeSpecName: "kube-api-access-m7kw8") pod "f3d5954c-8871-4eb9-aa3d-4ec131b12791" (UID: "f3d5954c-8871-4eb9-aa3d-4ec131b12791"). InnerVolumeSpecName "kube-api-access-m7kw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.019474 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/833452de-c7d3-4b24-939d-13dccbbddadb-kube-api-access-qssvp" (OuterVolumeSpecName: "kube-api-access-qssvp") pod "833452de-c7d3-4b24-939d-13dccbbddadb" (UID: "833452de-c7d3-4b24-939d-13dccbbddadb"). InnerVolumeSpecName "kube-api-access-qssvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.115148 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7kw8\" (UniqueName: \"kubernetes.io/projected/f3d5954c-8871-4eb9-aa3d-4ec131b12791-kube-api-access-m7kw8\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.115185 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qssvp\" (UniqueName: \"kubernetes.io/projected/833452de-c7d3-4b24-939d-13dccbbddadb-kube-api-access-qssvp\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.542892 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-858b-account-create-bjjkx" event={"ID":"f3d5954c-8871-4eb9-aa3d-4ec131b12791","Type":"ContainerDied","Data":"defe480264772ff4905075cf1e6b7743b5e9aac25a3e96ce526ed70be6fbc029"} Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.542918 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-858b-account-create-bjjkx" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.542934 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="defe480264772ff4905075cf1e6b7743b5e9aac25a3e96ce526ed70be6fbc029" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.544864 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f1e7-account-create-nx5br" event={"ID":"833452de-c7d3-4b24-939d-13dccbbddadb","Type":"ContainerDied","Data":"c64b9b90ed3727a46458acd83c91d5fd46e71c86e60367d81bb57597af046658"} Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.544891 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c64b9b90ed3727a46458acd83c91d5fd46e71c86e60367d81bb57597af046658" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.544936 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f1e7-account-create-nx5br" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.843367 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.930732 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-config-data\") pod \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.930841 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-sg-core-conf-yaml\") pod \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.931003 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-combined-ca-bundle\") pod \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.931076 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-run-httpd\") pod \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.931144 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-log-httpd\") pod \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.931186 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shwj7\" (UniqueName: 
\"kubernetes.io/projected/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-kube-api-access-shwj7\") pod \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.931204 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-scripts\") pod \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\" (UID: \"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970\") " Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.932488 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" (UID: "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.933061 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" (UID: "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.943336 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-scripts" (OuterVolumeSpecName: "scripts") pod "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" (UID: "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.943396 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-kube-api-access-shwj7" (OuterVolumeSpecName: "kube-api-access-shwj7") pod "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" (UID: "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970"). InnerVolumeSpecName "kube-api-access-shwj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:47:36 crc kubenswrapper[4956]: I0930 05:47:36.964770 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" (UID: "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.009654 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" (UID: "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.033662 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.033702 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shwj7\" (UniqueName: \"kubernetes.io/projected/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-kube-api-access-shwj7\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.033716 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.033727 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.033737 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.033748 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.047443 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-config-data" (OuterVolumeSpecName: "config-data") pod "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" (UID: "12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.135102 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.554652 4956 generic.go:334] "Generic (PLEG): container finished" podID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerID="920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1" exitCode=0 Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.554699 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970","Type":"ContainerDied","Data":"920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1"} Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.554734 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970","Type":"ContainerDied","Data":"5f88b94c0a02442f13bbea275fc8fead9dc7f68e068559342c3a47369e2d8321"} Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.554756 4956 scope.go:117] "RemoveContainer" containerID="b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.554756 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.580956 4956 scope.go:117] "RemoveContainer" containerID="63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.608240 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.621858 4956 scope.go:117] "RemoveContainer" containerID="920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.645071 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.660203 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:47:37 crc kubenswrapper[4956]: E0930 05:47:37.660648 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="proxy-httpd" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.660660 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="proxy-httpd" Sep 30 05:47:37 crc kubenswrapper[4956]: E0930 05:47:37.660677 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="ceilometer-notification-agent" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.660695 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="ceilometer-notification-agent" Sep 30 05:47:37 crc kubenswrapper[4956]: E0930 05:47:37.660710 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d5954c-8871-4eb9-aa3d-4ec131b12791" containerName="mariadb-account-create" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.660716 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f3d5954c-8871-4eb9-aa3d-4ec131b12791" containerName="mariadb-account-create" Sep 30 05:47:37 crc kubenswrapper[4956]: E0930 05:47:37.660730 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="ceilometer-central-agent" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.660737 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="ceilometer-central-agent" Sep 30 05:47:37 crc kubenswrapper[4956]: E0930 05:47:37.660757 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="sg-core" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.660763 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="sg-core" Sep 30 05:47:37 crc kubenswrapper[4956]: E0930 05:47:37.660784 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833452de-c7d3-4b24-939d-13dccbbddadb" containerName="mariadb-account-create" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.660790 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="833452de-c7d3-4b24-939d-13dccbbddadb" containerName="mariadb-account-create" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.660995 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="ceilometer-central-agent" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.661009 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="sg-core" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.661019 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="ceilometer-notification-agent" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.661034 4956 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d5954c-8871-4eb9-aa3d-4ec131b12791" containerName="mariadb-account-create" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.661054 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" containerName="proxy-httpd" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.661064 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="833452de-c7d3-4b24-939d-13dccbbddadb" containerName="mariadb-account-create" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.662948 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.671744 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.685356 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.685785 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.701905 4956 scope.go:117] "RemoveContainer" containerID="8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.727081 4956 scope.go:117] "RemoveContainer" containerID="b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e" Sep 30 05:47:37 crc kubenswrapper[4956]: E0930 05:47:37.727642 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e\": container with ID starting with b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e not found: ID does not exist" 
containerID="b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.727756 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e"} err="failed to get container status \"b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e\": rpc error: code = NotFound desc = could not find container \"b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e\": container with ID starting with b77e46a5b741ab0faf77974e8d8de5b7bbc32f31f86319d7bba2887151f00d2e not found: ID does not exist" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.727837 4956 scope.go:117] "RemoveContainer" containerID="63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e" Sep 30 05:47:37 crc kubenswrapper[4956]: E0930 05:47:37.728474 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e\": container with ID starting with 63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e not found: ID does not exist" containerID="63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.728509 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e"} err="failed to get container status \"63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e\": rpc error: code = NotFound desc = could not find container \"63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e\": container with ID starting with 63b283fbb215a368ba8d8d477cf7c18f3daea7053c625156edd32a57d9e1891e not found: ID does not exist" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.728562 4956 scope.go:117] 
"RemoveContainer" containerID="920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1" Sep 30 05:47:37 crc kubenswrapper[4956]: E0930 05:47:37.728790 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1\": container with ID starting with 920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1 not found: ID does not exist" containerID="920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.728879 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1"} err="failed to get container status \"920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1\": rpc error: code = NotFound desc = could not find container \"920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1\": container with ID starting with 920b1615277045c8add6394b5ce4e44d12e8a86cf916ba12ab5cd8d7db4a5de1 not found: ID does not exist" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.728904 4956 scope.go:117] "RemoveContainer" containerID="8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef" Sep 30 05:47:37 crc kubenswrapper[4956]: E0930 05:47:37.729496 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef\": container with ID starting with 8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef not found: ID does not exist" containerID="8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.729526 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef"} err="failed to get container status \"8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef\": rpc error: code = NotFound desc = could not find container \"8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef\": container with ID starting with 8b65d86ded3624dca360b8feb6d463ec584c784e8d07a0e89c057f7a08f73aef not found: ID does not exist" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.745811 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.745880 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-run-httpd\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.745928 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.745962 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-scripts\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 
05:47:37.746012 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwl7z\" (UniqueName: \"kubernetes.io/projected/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-kube-api-access-nwl7z\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.746052 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-config-data\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.746094 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-log-httpd\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.847157 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwl7z\" (UniqueName: \"kubernetes.io/projected/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-kube-api-access-nwl7z\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.847206 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-config-data\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.847250 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-log-httpd\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.847289 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.847310 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-run-httpd\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.847344 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.847374 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-scripts\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.848085 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-log-httpd\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.848824 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-run-httpd\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.851229 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-scripts\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.851582 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-config-data\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.851792 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.861352 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " pod="openstack/ceilometer-0" Sep 30 05:47:37 crc kubenswrapper[4956]: I0930 05:47:37.865258 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwl7z\" (UniqueName: \"kubernetes.io/projected/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-kube-api-access-nwl7z\") pod \"ceilometer-0\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " 
pod="openstack/ceilometer-0" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.003851 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.272766 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s77tj"] Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.277732 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.287731 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.288079 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9xkks" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.288769 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.291425 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s77tj"] Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.354641 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970" path="/var/lib/kubelet/pods/12acdf38-a8f5-4cb7-9ef1-0c7ece0ae970/volumes" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.356099 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62hpt\" (UniqueName: \"kubernetes.io/projected/049f3cac-fecc-4e1f-adcf-95b6c9202221-kube-api-access-62hpt\") pod \"nova-cell0-conductor-db-sync-s77tj\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.356605 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-scripts\") pod \"nova-cell0-conductor-db-sync-s77tj\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.356724 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-config-data\") pod \"nova-cell0-conductor-db-sync-s77tj\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.356797 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s77tj\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.457562 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-config-data\") pod \"nova-cell0-conductor-db-sync-s77tj\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.457664 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s77tj\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc 
kubenswrapper[4956]: I0930 05:47:38.457734 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62hpt\" (UniqueName: \"kubernetes.io/projected/049f3cac-fecc-4e1f-adcf-95b6c9202221-kube-api-access-62hpt\") pod \"nova-cell0-conductor-db-sync-s77tj\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.457767 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-scripts\") pod \"nova-cell0-conductor-db-sync-s77tj\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.465801 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-s77tj\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.469729 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-config-data\") pod \"nova-cell0-conductor-db-sync-s77tj\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.475217 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-scripts\") pod \"nova-cell0-conductor-db-sync-s77tj\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.477938 
4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62hpt\" (UniqueName: \"kubernetes.io/projected/049f3cac-fecc-4e1f-adcf-95b6c9202221-kube-api-access-62hpt\") pod \"nova-cell0-conductor-db-sync-s77tj\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.523863 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.569346 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2","Type":"ContainerStarted","Data":"108fa6f173f894d03c21d02a77b3c5bd686a4720c9a07f535566973b12a0ee78"} Sep 30 05:47:38 crc kubenswrapper[4956]: I0930 05:47:38.606686 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:47:39 crc kubenswrapper[4956]: I0930 05:47:39.073263 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s77tj"] Sep 30 05:47:39 crc kubenswrapper[4956]: I0930 05:47:39.582706 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s77tj" event={"ID":"049f3cac-fecc-4e1f-adcf-95b6c9202221","Type":"ContainerStarted","Data":"f1fdaac6d636b160b59dc092a8e9d5b0397218693c43f346336381ba731b28e6"} Sep 30 05:47:39 crc kubenswrapper[4956]: I0930 05:47:39.603172 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2","Type":"ContainerStarted","Data":"a28f3c524561e9d9423a3140f34302fad7d366c0d64ec1bc964d255317a0dfc2"} Sep 30 05:47:39 crc kubenswrapper[4956]: I0930 05:47:39.603214 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2","Type":"ContainerStarted","Data":"f65e2bd3b8ce3687fcb66dcd7ee6bcfe2761c444b8a847b830d90111e6390291"} Sep 30 05:47:40 crc kubenswrapper[4956]: I0930 05:47:40.634276 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2","Type":"ContainerStarted","Data":"28c44ea5ca2bc9d00e4a949cb1a5dae2fd28ae14e0f11a48480d687dbe8fe6e6"} Sep 30 05:47:41 crc kubenswrapper[4956]: I0930 05:47:41.648915 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2","Type":"ContainerStarted","Data":"0470965ed6650d5bb129a0ca4d56c5adc162bbd7dca43dbbf22f39d2dd8211d7"} Sep 30 05:47:41 crc kubenswrapper[4956]: I0930 05:47:41.649531 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 05:47:41 crc kubenswrapper[4956]: I0930 05:47:41.677104 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.970526427 podStartE2EDuration="4.677077659s" podCreationTimestamp="2025-09-30 05:47:37 +0000 UTC" firstStartedPulling="2025-09-30 05:47:38.525667144 +0000 UTC m=+1128.852787669" lastFinishedPulling="2025-09-30 05:47:41.232218376 +0000 UTC m=+1131.559338901" observedRunningTime="2025-09-30 05:47:41.669642299 +0000 UTC m=+1131.996762864" watchObservedRunningTime="2025-09-30 05:47:41.677077659 +0000 UTC m=+1132.004198184" Sep 30 05:47:49 crc kubenswrapper[4956]: I0930 05:47:49.759795 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s77tj" event={"ID":"049f3cac-fecc-4e1f-adcf-95b6c9202221","Type":"ContainerStarted","Data":"07ff28d47764f596ce31b1eb3eb156fbc57d9c27c967fbd11cc84e27f711a6da"} Sep 30 05:47:49 crc kubenswrapper[4956]: I0930 05:47:49.779631 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-conductor-db-sync-s77tj" podStartSLOduration=1.611566107 podStartE2EDuration="11.779614724s" podCreationTimestamp="2025-09-30 05:47:38 +0000 UTC" firstStartedPulling="2025-09-30 05:47:39.083839193 +0000 UTC m=+1129.410959738" lastFinishedPulling="2025-09-30 05:47:49.25188783 +0000 UTC m=+1139.579008355" observedRunningTime="2025-09-30 05:47:49.773706831 +0000 UTC m=+1140.100827366" watchObservedRunningTime="2025-09-30 05:47:49.779614724 +0000 UTC m=+1140.106735249" Sep 30 05:47:59 crc kubenswrapper[4956]: I0930 05:47:59.855031 4956 generic.go:334] "Generic (PLEG): container finished" podID="049f3cac-fecc-4e1f-adcf-95b6c9202221" containerID="07ff28d47764f596ce31b1eb3eb156fbc57d9c27c967fbd11cc84e27f711a6da" exitCode=0 Sep 30 05:47:59 crc kubenswrapper[4956]: I0930 05:47:59.855102 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s77tj" event={"ID":"049f3cac-fecc-4e1f-adcf-95b6c9202221","Type":"ContainerDied","Data":"07ff28d47764f596ce31b1eb3eb156fbc57d9c27c967fbd11cc84e27f711a6da"} Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.216585 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.347414 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-config-data\") pod \"049f3cac-fecc-4e1f-adcf-95b6c9202221\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.347537 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-scripts\") pod \"049f3cac-fecc-4e1f-adcf-95b6c9202221\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.347646 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-combined-ca-bundle\") pod \"049f3cac-fecc-4e1f-adcf-95b6c9202221\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.347670 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62hpt\" (UniqueName: \"kubernetes.io/projected/049f3cac-fecc-4e1f-adcf-95b6c9202221-kube-api-access-62hpt\") pod \"049f3cac-fecc-4e1f-adcf-95b6c9202221\" (UID: \"049f3cac-fecc-4e1f-adcf-95b6c9202221\") " Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.353275 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049f3cac-fecc-4e1f-adcf-95b6c9202221-kube-api-access-62hpt" (OuterVolumeSpecName: "kube-api-access-62hpt") pod "049f3cac-fecc-4e1f-adcf-95b6c9202221" (UID: "049f3cac-fecc-4e1f-adcf-95b6c9202221"). InnerVolumeSpecName "kube-api-access-62hpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.353857 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-scripts" (OuterVolumeSpecName: "scripts") pod "049f3cac-fecc-4e1f-adcf-95b6c9202221" (UID: "049f3cac-fecc-4e1f-adcf-95b6c9202221"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.380712 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-config-data" (OuterVolumeSpecName: "config-data") pod "049f3cac-fecc-4e1f-adcf-95b6c9202221" (UID: "049f3cac-fecc-4e1f-adcf-95b6c9202221"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.384641 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "049f3cac-fecc-4e1f-adcf-95b6c9202221" (UID: "049f3cac-fecc-4e1f-adcf-95b6c9202221"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.450851 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.450901 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.450922 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049f3cac-fecc-4e1f-adcf-95b6c9202221-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.450941 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62hpt\" (UniqueName: \"kubernetes.io/projected/049f3cac-fecc-4e1f-adcf-95b6c9202221-kube-api-access-62hpt\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.879149 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-s77tj" event={"ID":"049f3cac-fecc-4e1f-adcf-95b6c9202221","Type":"ContainerDied","Data":"f1fdaac6d636b160b59dc092a8e9d5b0397218693c43f346336381ba731b28e6"} Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.879194 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1fdaac6d636b160b59dc092a8e9d5b0397218693c43f346336381ba731b28e6" Sep 30 05:48:01 crc kubenswrapper[4956]: I0930 05:48:01.879242 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-s77tj" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.001477 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 05:48:02 crc kubenswrapper[4956]: E0930 05:48:02.002214 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049f3cac-fecc-4e1f-adcf-95b6c9202221" containerName="nova-cell0-conductor-db-sync" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.002230 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="049f3cac-fecc-4e1f-adcf-95b6c9202221" containerName="nova-cell0-conductor-db-sync" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.002401 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="049f3cac-fecc-4e1f-adcf-95b6c9202221" containerName="nova-cell0-conductor-db-sync" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.003059 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.004872 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9xkks" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.005209 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.016973 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.163500 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3275745-4918-4509-ab41-ad6e6653bcc8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c3275745-4918-4509-ab41-ad6e6653bcc8\") " pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 
05:48:02.163557 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrx6g\" (UniqueName: \"kubernetes.io/projected/c3275745-4918-4509-ab41-ad6e6653bcc8-kube-api-access-nrx6g\") pod \"nova-cell0-conductor-0\" (UID: \"c3275745-4918-4509-ab41-ad6e6653bcc8\") " pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.163615 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3275745-4918-4509-ab41-ad6e6653bcc8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c3275745-4918-4509-ab41-ad6e6653bcc8\") " pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.266068 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3275745-4918-4509-ab41-ad6e6653bcc8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c3275745-4918-4509-ab41-ad6e6653bcc8\") " pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.266424 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3275745-4918-4509-ab41-ad6e6653bcc8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c3275745-4918-4509-ab41-ad6e6653bcc8\") " pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.267179 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrx6g\" (UniqueName: \"kubernetes.io/projected/c3275745-4918-4509-ab41-ad6e6653bcc8-kube-api-access-nrx6g\") pod \"nova-cell0-conductor-0\" (UID: \"c3275745-4918-4509-ab41-ad6e6653bcc8\") " pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.271550 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3275745-4918-4509-ab41-ad6e6653bcc8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c3275745-4918-4509-ab41-ad6e6653bcc8\") " pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.272238 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3275745-4918-4509-ab41-ad6e6653bcc8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c3275745-4918-4509-ab41-ad6e6653bcc8\") " pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.292972 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrx6g\" (UniqueName: \"kubernetes.io/projected/c3275745-4918-4509-ab41-ad6e6653bcc8-kube-api-access-nrx6g\") pod \"nova-cell0-conductor-0\" (UID: \"c3275745-4918-4509-ab41-ad6e6653bcc8\") " pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.334624 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.819400 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 05:48:02 crc kubenswrapper[4956]: I0930 05:48:02.893288 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c3275745-4918-4509-ab41-ad6e6653bcc8","Type":"ContainerStarted","Data":"fd8de6926dc88693382b796c31101745963e709d484ef87cdaee2db9e0eb42be"} Sep 30 05:48:03 crc kubenswrapper[4956]: I0930 05:48:03.903469 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c3275745-4918-4509-ab41-ad6e6653bcc8","Type":"ContainerStarted","Data":"3da4385a4236c091aa9a61ff155d08c49f46788d77684139c8ab133d1c0bf5d8"} Sep 30 05:48:03 crc kubenswrapper[4956]: I0930 05:48:03.904894 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:08 crc kubenswrapper[4956]: I0930 05:48:08.012653 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 05:48:08 crc kubenswrapper[4956]: I0930 05:48:08.044539 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=7.044515128 podStartE2EDuration="7.044515128s" podCreationTimestamp="2025-09-30 05:48:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:48:03.928658697 +0000 UTC m=+1154.255779222" watchObservedRunningTime="2025-09-30 05:48:08.044515128 +0000 UTC m=+1158.371635693" Sep 30 05:48:11 crc kubenswrapper[4956]: I0930 05:48:11.411105 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 05:48:11 crc kubenswrapper[4956]: I0930 05:48:11.414383 4956 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5994e7b3-410f-47d8-9aa1-ddd019b123ec" containerName="kube-state-metrics" containerID="cri-o://0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700" gracePeriod=30 Sep 30 05:48:11 crc kubenswrapper[4956]: I0930 05:48:11.859566 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 05:48:11 crc kubenswrapper[4956]: I0930 05:48:11.953860 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvsss\" (UniqueName: \"kubernetes.io/projected/5994e7b3-410f-47d8-9aa1-ddd019b123ec-kube-api-access-lvsss\") pod \"5994e7b3-410f-47d8-9aa1-ddd019b123ec\" (UID: \"5994e7b3-410f-47d8-9aa1-ddd019b123ec\") " Sep 30 05:48:11 crc kubenswrapper[4956]: I0930 05:48:11.960419 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5994e7b3-410f-47d8-9aa1-ddd019b123ec-kube-api-access-lvsss" (OuterVolumeSpecName: "kube-api-access-lvsss") pod "5994e7b3-410f-47d8-9aa1-ddd019b123ec" (UID: "5994e7b3-410f-47d8-9aa1-ddd019b123ec"). InnerVolumeSpecName "kube-api-access-lvsss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:48:11 crc kubenswrapper[4956]: I0930 05:48:11.999649 4956 generic.go:334] "Generic (PLEG): container finished" podID="5994e7b3-410f-47d8-9aa1-ddd019b123ec" containerID="0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700" exitCode=2 Sep 30 05:48:11 crc kubenswrapper[4956]: I0930 05:48:11.999699 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5994e7b3-410f-47d8-9aa1-ddd019b123ec","Type":"ContainerDied","Data":"0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700"} Sep 30 05:48:11 crc kubenswrapper[4956]: I0930 05:48:11.999730 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5994e7b3-410f-47d8-9aa1-ddd019b123ec","Type":"ContainerDied","Data":"bb6e09d6964112276353a46a1fb2704c2644aa5f5ef18831de79ec6cfcbacdc6"} Sep 30 05:48:11 crc kubenswrapper[4956]: I0930 05:48:11.999749 4956 scope.go:117] "RemoveContainer" containerID="0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700" Sep 30 05:48:11 crc kubenswrapper[4956]: I0930 05:48:11.999895 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.032267 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.035735 4956 scope.go:117] "RemoveContainer" containerID="0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700" Sep 30 05:48:12 crc kubenswrapper[4956]: E0930 05:48:12.036764 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700\": container with ID starting with 0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700 not found: ID does not exist" containerID="0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.036807 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700"} err="failed to get container status \"0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700\": rpc error: code = NotFound desc = could not find container \"0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700\": container with ID starting with 0172961f9ef88a77c0cbcc0080174e536fce7d095828640386a54aeec4e8c700 not found: ID does not exist" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.049924 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.055865 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvsss\" (UniqueName: \"kubernetes.io/projected/5994e7b3-410f-47d8-9aa1-ddd019b123ec-kube-api-access-lvsss\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.060706 4956 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 05:48:12 crc kubenswrapper[4956]: E0930 05:48:12.064150 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5994e7b3-410f-47d8-9aa1-ddd019b123ec" containerName="kube-state-metrics" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.064236 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="5994e7b3-410f-47d8-9aa1-ddd019b123ec" containerName="kube-state-metrics" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.064504 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="5994e7b3-410f-47d8-9aa1-ddd019b123ec" containerName="kube-state-metrics" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.065405 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.067245 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.067283 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.071366 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.157677 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1b53d536-4a4f-463d-bae5-360555cd4583-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1b53d536-4a4f-463d-bae5-360555cd4583\") " pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.157768 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg9n8\" (UniqueName: 
\"kubernetes.io/projected/1b53d536-4a4f-463d-bae5-360555cd4583-kube-api-access-qg9n8\") pod \"kube-state-metrics-0\" (UID: \"1b53d536-4a4f-463d-bae5-360555cd4583\") " pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.157869 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b53d536-4a4f-463d-bae5-360555cd4583-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1b53d536-4a4f-463d-bae5-360555cd4583\") " pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.157923 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b53d536-4a4f-463d-bae5-360555cd4583-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1b53d536-4a4f-463d-bae5-360555cd4583\") " pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.260095 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b53d536-4a4f-463d-bae5-360555cd4583-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1b53d536-4a4f-463d-bae5-360555cd4583\") " pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.260156 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b53d536-4a4f-463d-bae5-360555cd4583-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1b53d536-4a4f-463d-bae5-360555cd4583\") " pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.260229 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/1b53d536-4a4f-463d-bae5-360555cd4583-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1b53d536-4a4f-463d-bae5-360555cd4583\") " pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.260271 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg9n8\" (UniqueName: \"kubernetes.io/projected/1b53d536-4a4f-463d-bae5-360555cd4583-kube-api-access-qg9n8\") pod \"kube-state-metrics-0\" (UID: \"1b53d536-4a4f-463d-bae5-360555cd4583\") " pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.264695 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b53d536-4a4f-463d-bae5-360555cd4583-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1b53d536-4a4f-463d-bae5-360555cd4583\") " pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.265484 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1b53d536-4a4f-463d-bae5-360555cd4583-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1b53d536-4a4f-463d-bae5-360555cd4583\") " pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.266595 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b53d536-4a4f-463d-bae5-360555cd4583-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1b53d536-4a4f-463d-bae5-360555cd4583\") " pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.277737 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg9n8\" (UniqueName: 
\"kubernetes.io/projected/1b53d536-4a4f-463d-bae5-360555cd4583-kube-api-access-qg9n8\") pod \"kube-state-metrics-0\" (UID: \"1b53d536-4a4f-463d-bae5-360555cd4583\") " pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.350964 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5994e7b3-410f-47d8-9aa1-ddd019b123ec" path="/var/lib/kubelet/pods/5994e7b3-410f-47d8-9aa1-ddd019b123ec/volumes" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.366732 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.395763 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.881732 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 05:48:12 crc kubenswrapper[4956]: I0930 05:48:12.892875 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.008180 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b53d536-4a4f-463d-bae5-360555cd4583","Type":"ContainerStarted","Data":"120b98ce16d24725d49c70eedd7f0ac9708918b9cfcfc29625a9d37fd87a729a"} Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.073189 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xzlrc"] Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.076925 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.079860 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.080029 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.084798 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xzlrc"] Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.175482 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q628p\" (UniqueName: \"kubernetes.io/projected/89d1f309-f110-45e2-ae88-8043eca7553e-kube-api-access-q628p\") pod \"nova-cell0-cell-mapping-xzlrc\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.175543 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xzlrc\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.175607 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-config-data\") pod \"nova-cell0-cell-mapping-xzlrc\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.175632 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-scripts\") pod \"nova-cell0-cell-mapping-xzlrc\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.233541 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.234709 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.241107 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.251046 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.279490 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xzlrc\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.279591 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-config-data\") pod \"nova-cell0-cell-mapping-xzlrc\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.279621 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-scripts\") pod \"nova-cell0-cell-mapping-xzlrc\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " 
pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.279703 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q628p\" (UniqueName: \"kubernetes.io/projected/89d1f309-f110-45e2-ae88-8043eca7553e-kube-api-access-q628p\") pod \"nova-cell0-cell-mapping-xzlrc\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.301031 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q628p\" (UniqueName: \"kubernetes.io/projected/89d1f309-f110-45e2-ae88-8043eca7553e-kube-api-access-q628p\") pod \"nova-cell0-cell-mapping-xzlrc\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.301206 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xzlrc\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.321891 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-config-data\") pod \"nova-cell0-cell-mapping-xzlrc\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.333492 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.334948 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-scripts\") pod 
\"nova-cell0-cell-mapping-xzlrc\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.335153 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.339544 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.353647 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.383285 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c721d55-4875-4026-b74b-28f4afc31f99-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3c721d55-4875-4026-b74b-28f4afc31f99\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.383397 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c721d55-4875-4026-b74b-28f4afc31f99-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3c721d55-4875-4026-b74b-28f4afc31f99\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.383439 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7chbh\" (UniqueName: \"kubernetes.io/projected/3c721d55-4875-4026-b74b-28f4afc31f99-kube-api-access-7chbh\") pod \"nova-cell1-novncproxy-0\" (UID: \"3c721d55-4875-4026-b74b-28f4afc31f99\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.429507 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.486200 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.488150 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7chbh\" (UniqueName: \"kubernetes.io/projected/3c721d55-4875-4026-b74b-28f4afc31f99-kube-api-access-7chbh\") pod \"nova-cell1-novncproxy-0\" (UID: \"3c721d55-4875-4026-b74b-28f4afc31f99\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.488213 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx6p2\" (UniqueName: \"kubernetes.io/projected/56141f3a-23d5-4bff-b378-cf54f7269e69-kube-api-access-kx6p2\") pod \"nova-metadata-0\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.488283 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c721d55-4875-4026-b74b-28f4afc31f99-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3c721d55-4875-4026-b74b-28f4afc31f99\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.488292 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.488310 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56141f3a-23d5-4bff-b378-cf54f7269e69-logs\") pod \"nova-metadata-0\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.488342 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56141f3a-23d5-4bff-b378-cf54f7269e69-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.488391 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56141f3a-23d5-4bff-b378-cf54f7269e69-config-data\") pod \"nova-metadata-0\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.488434 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c721d55-4875-4026-b74b-28f4afc31f99-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3c721d55-4875-4026-b74b-28f4afc31f99\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.498063 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c721d55-4875-4026-b74b-28f4afc31f99-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3c721d55-4875-4026-b74b-28f4afc31f99\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.507703 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c721d55-4875-4026-b74b-28f4afc31f99-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3c721d55-4875-4026-b74b-28f4afc31f99\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.513470 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.553533 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.561635 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7chbh\" (UniqueName: \"kubernetes.io/projected/3c721d55-4875-4026-b74b-28f4afc31f99-kube-api-access-7chbh\") pod \"nova-cell1-novncproxy-0\" (UID: \"3c721d55-4875-4026-b74b-28f4afc31f99\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.590194 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-rdwj5"] Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.591798 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.593320 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dmb5\" (UniqueName: \"kubernetes.io/projected/0f13b350-87fa-4854-ba9f-b70130abcf35-kube-api-access-5dmb5\") pod \"nova-api-0\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") " pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.593400 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56141f3a-23d5-4bff-b378-cf54f7269e69-logs\") pod \"nova-metadata-0\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.593432 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56141f3a-23d5-4bff-b378-cf54f7269e69-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.593452 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f13b350-87fa-4854-ba9f-b70130abcf35-config-data\") pod \"nova-api-0\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") " pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.593497 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56141f3a-23d5-4bff-b378-cf54f7269e69-config-data\") pod \"nova-metadata-0\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.593532 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f13b350-87fa-4854-ba9f-b70130abcf35-logs\") pod \"nova-api-0\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") " pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.593572 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f13b350-87fa-4854-ba9f-b70130abcf35-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") " pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.593605 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6p2\" (UniqueName: \"kubernetes.io/projected/56141f3a-23d5-4bff-b378-cf54f7269e69-kube-api-access-kx6p2\") pod \"nova-metadata-0\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.594320 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56141f3a-23d5-4bff-b378-cf54f7269e69-logs\") pod \"nova-metadata-0\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.608003 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56141f3a-23d5-4bff-b378-cf54f7269e69-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.611851 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56141f3a-23d5-4bff-b378-cf54f7269e69-config-data\") pod \"nova-metadata-0\" (UID: 
\"56141f3a-23d5-4bff-b378-cf54f7269e69\") " pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.675818 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx6p2\" (UniqueName: \"kubernetes.io/projected/56141f3a-23d5-4bff-b378-cf54f7269e69-kube-api-access-kx6p2\") pod \"nova-metadata-0\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.685455 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.724076 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f13b350-87fa-4854-ba9f-b70130abcf35-config-data\") pod \"nova-api-0\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") " pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.730850 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.733835 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.770307 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-ovsdbserver-sb\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.770757 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgwnl\" (UniqueName: \"kubernetes.io/projected/00eae5e6-c773-4bd9-af0f-bc17e1658996-kube-api-access-mgwnl\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.770854 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f13b350-87fa-4854-ba9f-b70130abcf35-logs\") pod \"nova-api-0\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") " pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.770929 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f13b350-87fa-4854-ba9f-b70130abcf35-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") " pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.770998 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-config\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.771025 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-dns-svc\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.771056 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-dns-swift-storage-0\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.771083 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-ovsdbserver-nb\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.771166 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dmb5\" (UniqueName: \"kubernetes.io/projected/0f13b350-87fa-4854-ba9f-b70130abcf35-kube-api-access-5dmb5\") pod \"nova-api-0\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") " pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.771747 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0f13b350-87fa-4854-ba9f-b70130abcf35-logs\") pod \"nova-api-0\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") " pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.778512 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f13b350-87fa-4854-ba9f-b70130abcf35-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") " pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.779324 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f13b350-87fa-4854-ba9f-b70130abcf35-config-data\") pod \"nova-api-0\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") " pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.789439 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.814969 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dmb5\" (UniqueName: \"kubernetes.io/projected/0f13b350-87fa-4854-ba9f-b70130abcf35-kube-api-access-5dmb5\") pod \"nova-api-0\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") " pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.833032 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.855499 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.862414 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-rdwj5"] Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.873732 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b3d9a0-6a91-4200-b4e1-98f05330a57c-config-data\") pod \"nova-scheduler-0\" (UID: \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\") " pod="openstack/nova-scheduler-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.873787 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-ovsdbserver-sb\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.875237 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-ovsdbserver-sb\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.878250 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgwnl\" (UniqueName: \"kubernetes.io/projected/00eae5e6-c773-4bd9-af0f-bc17e1658996-kube-api-access-mgwnl\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.878550 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-config\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.878578 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-dns-svc\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.878607 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-dns-swift-storage-0\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.878631 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-ovsdbserver-nb\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.878712 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b3d9a0-6a91-4200-b4e1-98f05330a57c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\") " pod="openstack/nova-scheduler-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.878748 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t29k4\" (UniqueName: 
\"kubernetes.io/projected/48b3d9a0-6a91-4200-b4e1-98f05330a57c-kube-api-access-t29k4\") pod \"nova-scheduler-0\" (UID: \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\") " pod="openstack/nova-scheduler-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.881766 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-dns-swift-storage-0\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.882249 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.882460 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-dns-svc\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.882961 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-ovsdbserver-nb\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.884173 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-config\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.944058 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mgwnl\" (UniqueName: \"kubernetes.io/projected/00eae5e6-c773-4bd9-af0f-bc17e1658996-kube-api-access-mgwnl\") pod \"dnsmasq-dns-844fc57f6f-rdwj5\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.980963 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b3d9a0-6a91-4200-b4e1-98f05330a57c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\") " pod="openstack/nova-scheduler-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.981016 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t29k4\" (UniqueName: \"kubernetes.io/projected/48b3d9a0-6a91-4200-b4e1-98f05330a57c-kube-api-access-t29k4\") pod \"nova-scheduler-0\" (UID: \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\") " pod="openstack/nova-scheduler-0" Sep 30 05:48:13 crc kubenswrapper[4956]: I0930 05:48:13.981059 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b3d9a0-6a91-4200-b4e1-98f05330a57c-config-data\") pod \"nova-scheduler-0\" (UID: \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\") " pod="openstack/nova-scheduler-0" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.001961 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t29k4\" (UniqueName: \"kubernetes.io/projected/48b3d9a0-6a91-4200-b4e1-98f05330a57c-kube-api-access-t29k4\") pod \"nova-scheduler-0\" (UID: \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\") " pod="openstack/nova-scheduler-0" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.011789 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b3d9a0-6a91-4200-b4e1-98f05330a57c-config-data\") pod \"nova-scheduler-0\" 
(UID: \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\") " pod="openstack/nova-scheduler-0" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.015182 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b3d9a0-6a91-4200-b4e1-98f05330a57c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\") " pod="openstack/nova-scheduler-0" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.067370 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.102462 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.695047419 podStartE2EDuration="2.102443961s" podCreationTimestamp="2025-09-30 05:48:12 +0000 UTC" firstStartedPulling="2025-09-30 05:48:12.892692912 +0000 UTC m=+1163.219813437" lastFinishedPulling="2025-09-30 05:48:13.300089454 +0000 UTC m=+1163.627209979" observedRunningTime="2025-09-30 05:48:14.087410354 +0000 UTC m=+1164.414530879" watchObservedRunningTime="2025-09-30 05:48:14.102443961 +0000 UTC m=+1164.429564486" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.154434 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.154679 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="ceilometer-central-agent" containerID="cri-o://f65e2bd3b8ce3687fcb66dcd7ee6bcfe2761c444b8a847b830d90111e6390291" gracePeriod=30 Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.155082 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="proxy-httpd" 
containerID="cri-o://0470965ed6650d5bb129a0ca4d56c5adc162bbd7dca43dbbf22f39d2dd8211d7" gracePeriod=30 Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.155134 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="ceilometer-notification-agent" containerID="cri-o://a28f3c524561e9d9423a3140f34302fad7d366c0d64ec1bc964d255317a0dfc2" gracePeriod=30 Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.155206 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="sg-core" containerID="cri-o://28c44ea5ca2bc9d00e4a949cb1a5dae2fd28ae14e0f11a48480d687dbe8fe6e6" gracePeriod=30 Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.205923 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xzlrc"] Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.239563 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.256900 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.425180 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.431150 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kb9pc"] Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.432435 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.443749 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.444078 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.444289 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kb9pc"] Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.597816 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-config-data\") pod \"nova-cell1-conductor-db-sync-kb9pc\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") " pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.598142 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-scripts\") pod \"nova-cell1-conductor-db-sync-kb9pc\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") " pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.598164 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kb9pc\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") " pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.598267 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p4nsb\" (UniqueName: \"kubernetes.io/projected/1691e854-b1af-4974-b06a-716b66af43b4-kube-api-access-p4nsb\") pod \"nova-cell1-conductor-db-sync-kb9pc\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") " pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.699345 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-config-data\") pod \"nova-cell1-conductor-db-sync-kb9pc\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") " pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.699399 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-scripts\") pod \"nova-cell1-conductor-db-sync-kb9pc\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") " pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.699417 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kb9pc\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") " pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.699493 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4nsb\" (UniqueName: \"kubernetes.io/projected/1691e854-b1af-4974-b06a-716b66af43b4-kube-api-access-p4nsb\") pod \"nova-cell1-conductor-db-sync-kb9pc\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") " pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.708871 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kb9pc\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") " pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.714732 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-scripts\") pod \"nova-cell1-conductor-db-sync-kb9pc\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") " pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.722526 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-config-data\") pod \"nova-cell1-conductor-db-sync-kb9pc\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") " pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.726219 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4nsb\" (UniqueName: \"kubernetes.io/projected/1691e854-b1af-4974-b06a-716b66af43b4-kube-api-access-p4nsb\") pod \"nova-cell1-conductor-db-sync-kb9pc\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") " pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.795970 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kb9pc" Sep 30 05:48:14 crc kubenswrapper[4956]: I0930 05:48:14.983280 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.002686 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.090770 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1b53d536-4a4f-463d-bae5-360555cd4583","Type":"ContainerStarted","Data":"628b72a2800fa85745d0d0244f7f7b0035f1662101a1fbaedd81264ec26ad150"} Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.093995 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56141f3a-23d5-4bff-b378-cf54f7269e69","Type":"ContainerStarted","Data":"7f9150b2dcf9d0743ed232d87c3123802aa46684f06a7993d1b23d976d62e3ce"} Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.099417 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3c721d55-4875-4026-b74b-28f4afc31f99","Type":"ContainerStarted","Data":"7b4f8befea8df448fdd5b4840029c9817371ad90dece0aa050417144e2d31eee"} Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.113659 4956 generic.go:334] "Generic (PLEG): container finished" podID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerID="0470965ed6650d5bb129a0ca4d56c5adc162bbd7dca43dbbf22f39d2dd8211d7" exitCode=0 Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.113702 4956 generic.go:334] "Generic (PLEG): container finished" podID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerID="28c44ea5ca2bc9d00e4a949cb1a5dae2fd28ae14e0f11a48480d687dbe8fe6e6" exitCode=2 Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.113714 4956 generic.go:334] "Generic (PLEG): container finished" podID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" 
containerID="f65e2bd3b8ce3687fcb66dcd7ee6bcfe2761c444b8a847b830d90111e6390291" exitCode=0 Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.113766 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2","Type":"ContainerDied","Data":"0470965ed6650d5bb129a0ca4d56c5adc162bbd7dca43dbbf22f39d2dd8211d7"} Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.113797 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2","Type":"ContainerDied","Data":"28c44ea5ca2bc9d00e4a949cb1a5dae2fd28ae14e0f11a48480d687dbe8fe6e6"} Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.113811 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2","Type":"ContainerDied","Data":"f65e2bd3b8ce3687fcb66dcd7ee6bcfe2761c444b8a847b830d90111e6390291"} Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.119085 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xzlrc" event={"ID":"89d1f309-f110-45e2-ae88-8043eca7553e","Type":"ContainerStarted","Data":"c74d0674d1bfa2eb3362b4e92462a8e0c95593532f441c16280005e4f0ad7e96"} Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.119190 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xzlrc" event={"ID":"89d1f309-f110-45e2-ae88-8043eca7553e","Type":"ContainerStarted","Data":"0326577f49650decb1f09bb021f54718718c83807565f7e221b2866e4a28c072"} Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.125637 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0f13b350-87fa-4854-ba9f-b70130abcf35","Type":"ContainerStarted","Data":"b2f00b063e6d692e0182cc17a1089937fce2fbed961dadfa1df7e5b07f7ba294"} Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.136570 4956 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xzlrc" podStartSLOduration=2.136551037 podStartE2EDuration="2.136551037s" podCreationTimestamp="2025-09-30 05:48:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:48:15.136085123 +0000 UTC m=+1165.463205648" watchObservedRunningTime="2025-09-30 05:48:15.136551037 +0000 UTC m=+1165.463671572" Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.419402 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.431591 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-rdwj5"] Sep 30 05:48:15 crc kubenswrapper[4956]: I0930 05:48:15.534327 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kb9pc"] Sep 30 05:48:15 crc kubenswrapper[4956]: W0930 05:48:15.595266 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1691e854_b1af_4974_b06a_716b66af43b4.slice/crio-1befaface2af823f827a17a7c213e598a349aa168a6e84bedf17325ee19dd739 WatchSource:0}: Error finding container 1befaface2af823f827a17a7c213e598a349aa168a6e84bedf17325ee19dd739: Status 404 returned error can't find the container with id 1befaface2af823f827a17a7c213e598a349aa168a6e84bedf17325ee19dd739 Sep 30 05:48:16 crc kubenswrapper[4956]: I0930 05:48:16.153568 4956 generic.go:334] "Generic (PLEG): container finished" podID="00eae5e6-c773-4bd9-af0f-bc17e1658996" containerID="4a88d58a0a5755f56a3a308a61f29d407e864a944db3025e57981033e2e28bb9" exitCode=0 Sep 30 05:48:16 crc kubenswrapper[4956]: I0930 05:48:16.153743 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" 
event={"ID":"00eae5e6-c773-4bd9-af0f-bc17e1658996","Type":"ContainerDied","Data":"4a88d58a0a5755f56a3a308a61f29d407e864a944db3025e57981033e2e28bb9"} Sep 30 05:48:16 crc kubenswrapper[4956]: I0930 05:48:16.153833 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" event={"ID":"00eae5e6-c773-4bd9-af0f-bc17e1658996","Type":"ContainerStarted","Data":"8c18c366a66f263d5b322b879a82ae850d094cdc98bde4340117d6a8c5a1abbc"} Sep 30 05:48:16 crc kubenswrapper[4956]: I0930 05:48:16.168242 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kb9pc" event={"ID":"1691e854-b1af-4974-b06a-716b66af43b4","Type":"ContainerStarted","Data":"d5cc994d3c74d52bfdb0e5c3cb7fcae96c1530c1f2b43189ac8f95df6cc1dae4"} Sep 30 05:48:16 crc kubenswrapper[4956]: I0930 05:48:16.168308 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kb9pc" event={"ID":"1691e854-b1af-4974-b06a-716b66af43b4","Type":"ContainerStarted","Data":"1befaface2af823f827a17a7c213e598a349aa168a6e84bedf17325ee19dd739"} Sep 30 05:48:16 crc kubenswrapper[4956]: I0930 05:48:16.180711 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"48b3d9a0-6a91-4200-b4e1-98f05330a57c","Type":"ContainerStarted","Data":"1dac25edb6d09dd1d3a775c85f567f336045c88294fb28dec7bb8b9a4abf0392"} Sep 30 05:48:16 crc kubenswrapper[4956]: I0930 05:48:16.223188 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-kb9pc" podStartSLOduration=2.223167264 podStartE2EDuration="2.223167264s" podCreationTimestamp="2025-09-30 05:48:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:48:16.190554832 +0000 UTC m=+1166.517675347" watchObservedRunningTime="2025-09-30 05:48:16.223167264 +0000 UTC m=+1166.550287789" Sep 30 
05:48:16 crc kubenswrapper[4956]: I0930 05:48:16.684383 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 05:48:16 crc kubenswrapper[4956]: I0930 05:48:16.705517 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.225220 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3c721d55-4875-4026-b74b-28f4afc31f99","Type":"ContainerStarted","Data":"972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f"} Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.225340 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="3c721d55-4875-4026-b74b-28f4afc31f99" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f" gracePeriod=30 Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.227268 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"48b3d9a0-6a91-4200-b4e1-98f05330a57c","Type":"ContainerStarted","Data":"601dbd4a3d7bbb8ba81c524514bd9420298f122143cf7707f0791238548a0ae4"} Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.229916 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0f13b350-87fa-4854-ba9f-b70130abcf35","Type":"ContainerStarted","Data":"7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed"} Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.229967 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0f13b350-87fa-4854-ba9f-b70130abcf35","Type":"ContainerStarted","Data":"2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14"} Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.231691 4956 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" event={"ID":"00eae5e6-c773-4bd9-af0f-bc17e1658996","Type":"ContainerStarted","Data":"261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c"} Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.232258 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.233702 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56141f3a-23d5-4bff-b378-cf54f7269e69","Type":"ContainerStarted","Data":"9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376"} Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.233734 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56141f3a-23d5-4bff-b378-cf54f7269e69","Type":"ContainerStarted","Data":"6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e"} Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.233856 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="56141f3a-23d5-4bff-b378-cf54f7269e69" containerName="nova-metadata-log" containerID="cri-o://6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e" gracePeriod=30 Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.234216 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="56141f3a-23d5-4bff-b378-cf54f7269e69" containerName="nova-metadata-metadata" containerID="cri-o://9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376" gracePeriod=30 Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.249219 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.402686515 podStartE2EDuration="7.249198859s" podCreationTimestamp="2025-09-30 05:48:13 +0000 UTC" 
firstStartedPulling="2025-09-30 05:48:14.994383166 +0000 UTC m=+1165.321503691" lastFinishedPulling="2025-09-30 05:48:18.84089551 +0000 UTC m=+1169.168016035" observedRunningTime="2025-09-30 05:48:20.246306459 +0000 UTC m=+1170.573426984" watchObservedRunningTime="2025-09-30 05:48:20.249198859 +0000 UTC m=+1170.576319394" Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.268019 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.434845963 podStartE2EDuration="7.268002082s" podCreationTimestamp="2025-09-30 05:48:13 +0000 UTC" firstStartedPulling="2025-09-30 05:48:15.009372842 +0000 UTC m=+1165.336493367" lastFinishedPulling="2025-09-30 05:48:18.842528931 +0000 UTC m=+1169.169649486" observedRunningTime="2025-09-30 05:48:20.263894235 +0000 UTC m=+1170.591014780" watchObservedRunningTime="2025-09-30 05:48:20.268002082 +0000 UTC m=+1170.595122597" Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.287228 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.909524452 podStartE2EDuration="7.287205818s" podCreationTimestamp="2025-09-30 05:48:13 +0000 UTC" firstStartedPulling="2025-09-30 05:48:15.464774722 +0000 UTC m=+1165.791895247" lastFinishedPulling="2025-09-30 05:48:18.842456088 +0000 UTC m=+1169.169576613" observedRunningTime="2025-09-30 05:48:20.27921411 +0000 UTC m=+1170.606334645" watchObservedRunningTime="2025-09-30 05:48:20.287205818 +0000 UTC m=+1170.614326363" Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.295841 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" podStartSLOduration=7.295821775 podStartE2EDuration="7.295821775s" podCreationTimestamp="2025-09-30 05:48:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:48:20.29374621 
+0000 UTC m=+1170.620866735" watchObservedRunningTime="2025-09-30 05:48:20.295821775 +0000 UTC m=+1170.622942310" Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.333394 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.935786627 podStartE2EDuration="7.333372091s" podCreationTimestamp="2025-09-30 05:48:13 +0000 UTC" firstStartedPulling="2025-09-30 05:48:14.443639907 +0000 UTC m=+1164.770760432" lastFinishedPulling="2025-09-30 05:48:18.841225371 +0000 UTC m=+1169.168345896" observedRunningTime="2025-09-30 05:48:20.309974904 +0000 UTC m=+1170.637095419" watchObservedRunningTime="2025-09-30 05:48:20.333372091 +0000 UTC m=+1170.660492626" Sep 30 05:48:20 crc kubenswrapper[4956]: I0930 05:48:20.938549 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.041739 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56141f3a-23d5-4bff-b378-cf54f7269e69-config-data\") pod \"56141f3a-23d5-4bff-b378-cf54f7269e69\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.041837 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56141f3a-23d5-4bff-b378-cf54f7269e69-logs\") pod \"56141f3a-23d5-4bff-b378-cf54f7269e69\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.041871 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56141f3a-23d5-4bff-b378-cf54f7269e69-combined-ca-bundle\") pod \"56141f3a-23d5-4bff-b378-cf54f7269e69\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 
05:48:21.041983 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx6p2\" (UniqueName: \"kubernetes.io/projected/56141f3a-23d5-4bff-b378-cf54f7269e69-kube-api-access-kx6p2\") pod \"56141f3a-23d5-4bff-b378-cf54f7269e69\" (UID: \"56141f3a-23d5-4bff-b378-cf54f7269e69\") " Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.043466 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56141f3a-23d5-4bff-b378-cf54f7269e69-logs" (OuterVolumeSpecName: "logs") pod "56141f3a-23d5-4bff-b378-cf54f7269e69" (UID: "56141f3a-23d5-4bff-b378-cf54f7269e69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.052055 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56141f3a-23d5-4bff-b378-cf54f7269e69-kube-api-access-kx6p2" (OuterVolumeSpecName: "kube-api-access-kx6p2") pod "56141f3a-23d5-4bff-b378-cf54f7269e69" (UID: "56141f3a-23d5-4bff-b378-cf54f7269e69"). InnerVolumeSpecName "kube-api-access-kx6p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.074571 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56141f3a-23d5-4bff-b378-cf54f7269e69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56141f3a-23d5-4bff-b378-cf54f7269e69" (UID: "56141f3a-23d5-4bff-b378-cf54f7269e69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.079986 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56141f3a-23d5-4bff-b378-cf54f7269e69-config-data" (OuterVolumeSpecName: "config-data") pod "56141f3a-23d5-4bff-b378-cf54f7269e69" (UID: "56141f3a-23d5-4bff-b378-cf54f7269e69"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.145747 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56141f3a-23d5-4bff-b378-cf54f7269e69-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.145783 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56141f3a-23d5-4bff-b378-cf54f7269e69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.145802 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx6p2\" (UniqueName: \"kubernetes.io/projected/56141f3a-23d5-4bff-b378-cf54f7269e69-kube-api-access-kx6p2\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.145813 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56141f3a-23d5-4bff-b378-cf54f7269e69-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.249626 4956 generic.go:334] "Generic (PLEG): container finished" podID="56141f3a-23d5-4bff-b378-cf54f7269e69" containerID="9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376" exitCode=0 Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.249674 4956 generic.go:334] "Generic (PLEG): container finished" podID="56141f3a-23d5-4bff-b378-cf54f7269e69" containerID="6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e" exitCode=143 Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.249674 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56141f3a-23d5-4bff-b378-cf54f7269e69","Type":"ContainerDied","Data":"9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376"} Sep 30 05:48:21 crc 
kubenswrapper[4956]: I0930 05:48:21.249710 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56141f3a-23d5-4bff-b378-cf54f7269e69","Type":"ContainerDied","Data":"6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e"} Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.249681 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.249735 4956 scope.go:117] "RemoveContainer" containerID="9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.249724 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56141f3a-23d5-4bff-b378-cf54f7269e69","Type":"ContainerDied","Data":"7f9150b2dcf9d0743ed232d87c3123802aa46684f06a7993d1b23d976d62e3ce"} Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.266574 4956 generic.go:334] "Generic (PLEG): container finished" podID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerID="a28f3c524561e9d9423a3140f34302fad7d366c0d64ec1bc964d255317a0dfc2" exitCode=0 Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.266825 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2","Type":"ContainerDied","Data":"a28f3c524561e9d9423a3140f34302fad7d366c0d64ec1bc964d255317a0dfc2"} Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.339804 4956 scope.go:117] "RemoveContainer" containerID="6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.370259 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.375972 4956 scope.go:117] "RemoveContainer" containerID="9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376" Sep 
30 05:48:21 crc kubenswrapper[4956]: E0930 05:48:21.377249 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376\": container with ID starting with 9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376 not found: ID does not exist" containerID="9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.377290 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376"} err="failed to get container status \"9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376\": rpc error: code = NotFound desc = could not find container \"9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376\": container with ID starting with 9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376 not found: ID does not exist" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.377311 4956 scope.go:117] "RemoveContainer" containerID="6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e" Sep 30 05:48:21 crc kubenswrapper[4956]: E0930 05:48:21.380143 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e\": container with ID starting with 6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e not found: ID does not exist" containerID="6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.380202 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e"} err="failed to get container status 
\"6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e\": rpc error: code = NotFound desc = could not find container \"6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e\": container with ID starting with 6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e not found: ID does not exist" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.380236 4956 scope.go:117] "RemoveContainer" containerID="9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.387838 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.388911 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376"} err="failed to get container status \"9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376\": rpc error: code = NotFound desc = could not find container \"9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376\": container with ID starting with 9cedc50284abd2ecf0d6979980df2704cb3997ccf6196aff4140fccdce2fc376 not found: ID does not exist" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.388954 4956 scope.go:117] "RemoveContainer" containerID="6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.389452 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e"} err="failed to get container status \"6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e\": rpc error: code = NotFound desc = could not find container \"6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e\": container with ID starting with 6dcc9012e41bcb89e1c286081062df9ed00951e5b687cb3a8bb60f739d3d564e not 
found: ID does not exist" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.397516 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:21 crc kubenswrapper[4956]: E0930 05:48:21.397820 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56141f3a-23d5-4bff-b378-cf54f7269e69" containerName="nova-metadata-metadata" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.397839 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="56141f3a-23d5-4bff-b378-cf54f7269e69" containerName="nova-metadata-metadata" Sep 30 05:48:21 crc kubenswrapper[4956]: E0930 05:48:21.397848 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56141f3a-23d5-4bff-b378-cf54f7269e69" containerName="nova-metadata-log" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.397854 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="56141f3a-23d5-4bff-b378-cf54f7269e69" containerName="nova-metadata-log" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.398082 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="56141f3a-23d5-4bff-b378-cf54f7269e69" containerName="nova-metadata-metadata" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.398104 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="56141f3a-23d5-4bff-b378-cf54f7269e69" containerName="nova-metadata-log" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.399191 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.404797 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.405159 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.405337 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.516528 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.553819 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pbjf\" (UniqueName: \"kubernetes.io/projected/b5c9436e-6da5-4040-a6a6-904add2b86b1-kube-api-access-9pbjf\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.553873 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-config-data\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.554069 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.554169 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.554225 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c9436e-6da5-4040-a6a6-904add2b86b1-logs\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.655739 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-config-data\") pod \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.655916 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-log-httpd\") pod \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.655944 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwl7z\" (UniqueName: \"kubernetes.io/projected/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-kube-api-access-nwl7z\") pod \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656033 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-run-httpd\") pod 
\"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656096 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-scripts\") pod \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656157 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-sg-core-conf-yaml\") pod \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656179 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-combined-ca-bundle\") pod \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656325 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" (UID: "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656393 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-config-data\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656454 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" (UID: "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656476 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656508 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656535 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c9436e-6da5-4040-a6a6-904add2b86b1-logs\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656609 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pbjf\" (UniqueName: \"kubernetes.io/projected/b5c9436e-6da5-4040-a6a6-904add2b86b1-kube-api-access-9pbjf\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656655 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.656669 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.657082 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c9436e-6da5-4040-a6a6-904add2b86b1-logs\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.660671 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-scripts" (OuterVolumeSpecName: "scripts") pod "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" (UID: "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.661067 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.661079 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-kube-api-access-nwl7z" (OuterVolumeSpecName: "kube-api-access-nwl7z") pod "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" (UID: "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2"). InnerVolumeSpecName "kube-api-access-nwl7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.661915 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-config-data\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.672657 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pbjf\" (UniqueName: \"kubernetes.io/projected/b5c9436e-6da5-4040-a6a6-904add2b86b1-kube-api-access-9pbjf\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.675025 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " pod="openstack/nova-metadata-0" Sep 30 05:48:21 crc 
kubenswrapper[4956]: I0930 05:48:21.691043 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" (UID: "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.736747 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" (UID: "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.757296 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-config-data" (OuterVolumeSpecName: "config-data") pod "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" (UID: "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.757995 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-config-data\") pod \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\" (UID: \"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2\") " Sep 30 05:48:21 crc kubenswrapper[4956]: W0930 05:48:21.758064 4956 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2/volumes/kubernetes.io~secret/config-data Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.758099 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-config-data" (OuterVolumeSpecName: "config-data") pod "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" (UID: "b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.758407 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.758422 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.758431 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwl7z\" (UniqueName: \"kubernetes.io/projected/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-kube-api-access-nwl7z\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.758441 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.758452 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:21 crc kubenswrapper[4956]: I0930 05:48:21.840890 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.280230 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2","Type":"ContainerDied","Data":"108fa6f173f894d03c21d02a77b3c5bd686a4720c9a07f535566973b12a0ee78"} Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.280330 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.280594 4956 scope.go:117] "RemoveContainer" containerID="0470965ed6650d5bb129a0ca4d56c5adc162bbd7dca43dbbf22f39d2dd8211d7" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.282987 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:22 crc kubenswrapper[4956]: W0930 05:48:22.289151 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5c9436e_6da5_4040_a6a6_904add2b86b1.slice/crio-952509472e3a8370db65ee4442a0d134b0d813208a1eb4f259607b5b3654528d WatchSource:0}: Error finding container 952509472e3a8370db65ee4442a0d134b0d813208a1eb4f259607b5b3654528d: Status 404 returned error can't find the container with id 952509472e3a8370db65ee4442a0d134b0d813208a1eb4f259607b5b3654528d Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.360991 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56141f3a-23d5-4bff-b378-cf54f7269e69" path="/var/lib/kubelet/pods/56141f3a-23d5-4bff-b378-cf54f7269e69/volumes" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.362081 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.380133 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.392167 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:48:22 crc kubenswrapper[4956]: E0930 05:48:22.392667 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="sg-core" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.392689 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="sg-core" 
Sep 30 05:48:22 crc kubenswrapper[4956]: E0930 05:48:22.392709 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="ceilometer-notification-agent" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.392719 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="ceilometer-notification-agent" Sep 30 05:48:22 crc kubenswrapper[4956]: E0930 05:48:22.392732 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="ceilometer-central-agent" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.392740 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="ceilometer-central-agent" Sep 30 05:48:22 crc kubenswrapper[4956]: E0930 05:48:22.392760 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="proxy-httpd" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.392768 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="proxy-httpd" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.392999 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="sg-core" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.393018 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="proxy-httpd" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.393040 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="ceilometer-notification-agent" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.393054 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" containerName="ceilometer-central-agent" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.395223 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.399927 4956 scope.go:117] "RemoveContainer" containerID="28c44ea5ca2bc9d00e4a949cb1a5dae2fd28ae14e0f11a48480d687dbe8fe6e6" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.400862 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.447288 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.447592 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.447745 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.463397 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.575265 4956 scope.go:117] "RemoveContainer" containerID="a28f3c524561e9d9423a3140f34302fad7d366c0d64ec1bc964d255317a0dfc2" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.580309 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11cda964-5209-4ff7-8cc2-c9cc77d4a105-log-httpd\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.580386 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.580500 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.580535 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.580564 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-config-data\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.580590 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11cda964-5209-4ff7-8cc2-c9cc77d4a105-run-httpd\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.580678 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-scripts\") pod \"ceilometer-0\" (UID: 
\"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.580788 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmvn5\" (UniqueName: \"kubernetes.io/projected/11cda964-5209-4ff7-8cc2-c9cc77d4a105-kube-api-access-vmvn5\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.600730 4956 scope.go:117] "RemoveContainer" containerID="f65e2bd3b8ce3687fcb66dcd7ee6bcfe2761c444b8a847b830d90111e6390291" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.682059 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.682184 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.682213 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.682233 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-config-data\") pod \"ceilometer-0\" 
(UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.682254 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11cda964-5209-4ff7-8cc2-c9cc77d4a105-run-httpd\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.682308 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-scripts\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.682363 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmvn5\" (UniqueName: \"kubernetes.io/projected/11cda964-5209-4ff7-8cc2-c9cc77d4a105-kube-api-access-vmvn5\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.682394 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11cda964-5209-4ff7-8cc2-c9cc77d4a105-log-httpd\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.682786 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11cda964-5209-4ff7-8cc2-c9cc77d4a105-log-httpd\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.683047 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/11cda964-5209-4ff7-8cc2-c9cc77d4a105-run-httpd\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.688418 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.688617 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-scripts\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.688930 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-config-data\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.689211 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.689619 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.720650 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmvn5\" (UniqueName: \"kubernetes.io/projected/11cda964-5209-4ff7-8cc2-c9cc77d4a105-kube-api-access-vmvn5\") pod \"ceilometer-0\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " pod="openstack/ceilometer-0" Sep 30 05:48:22 crc kubenswrapper[4956]: I0930 05:48:22.889217 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:48:23 crc kubenswrapper[4956]: I0930 05:48:23.290406 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5c9436e-6da5-4040-a6a6-904add2b86b1","Type":"ContainerStarted","Data":"aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4"} Sep 30 05:48:23 crc kubenswrapper[4956]: I0930 05:48:23.290780 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5c9436e-6da5-4040-a6a6-904add2b86b1","Type":"ContainerStarted","Data":"861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657"} Sep 30 05:48:23 crc kubenswrapper[4956]: I0930 05:48:23.290796 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5c9436e-6da5-4040-a6a6-904add2b86b1","Type":"ContainerStarted","Data":"952509472e3a8370db65ee4442a0d134b0d813208a1eb4f259607b5b3654528d"} Sep 30 05:48:23 crc kubenswrapper[4956]: I0930 05:48:23.320407 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.320386735 podStartE2EDuration="2.320386735s" podCreationTimestamp="2025-09-30 05:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:48:23.309237959 +0000 UTC m=+1173.636358494" watchObservedRunningTime="2025-09-30 05:48:23.320386735 +0000 UTC m=+1173.647507250" Sep 30 05:48:23 crc kubenswrapper[4956]: I0930 05:48:23.372732 4956 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:48:23 crc kubenswrapper[4956]: W0930 05:48:23.379193 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11cda964_5209_4ff7_8cc2_c9cc77d4a105.slice/crio-a56ae80d109f8efb5431930ceaf93575d8905ca4a272336e75e4e9e9de114b17 WatchSource:0}: Error finding container a56ae80d109f8efb5431930ceaf93575d8905ca4a272336e75e4e9e9de114b17: Status 404 returned error can't find the container with id a56ae80d109f8efb5431930ceaf93575d8905ca4a272336e75e4e9e9de114b17 Sep 30 05:48:23 crc kubenswrapper[4956]: I0930 05:48:23.834837 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 05:48:23 crc kubenswrapper[4956]: I0930 05:48:23.835217 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 05:48:23 crc kubenswrapper[4956]: I0930 05:48:23.856800 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.242079 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.258697 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.258758 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.308941 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-6sqg8"] Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.309983 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 05:48:24 crc 
kubenswrapper[4956]: I0930 05:48:24.309463 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" podUID="058e635e-0fa8-429a-9a36-b52a9f219228" containerName="dnsmasq-dns" containerID="cri-o://70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94" gracePeriod=10 Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.328237 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11cda964-5209-4ff7-8cc2-c9cc77d4a105","Type":"ContainerStarted","Data":"867fb896f2fe9db31f7881fd9cd2b4f66860db34c0c3ec7a9949cd9a35b36b75"} Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.328280 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11cda964-5209-4ff7-8cc2-c9cc77d4a105","Type":"ContainerStarted","Data":"a56ae80d109f8efb5431930ceaf93575d8905ca4a272336e75e4e9e9de114b17"} Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.336026 4956 generic.go:334] "Generic (PLEG): container finished" podID="89d1f309-f110-45e2-ae88-8043eca7553e" containerID="c74d0674d1bfa2eb3362b4e92462a8e0c95593532f441c16280005e4f0ad7e96" exitCode=0 Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.336611 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xzlrc" event={"ID":"89d1f309-f110-45e2-ae88-8043eca7553e","Type":"ContainerDied","Data":"c74d0674d1bfa2eb3362b4e92462a8e0c95593532f441c16280005e4f0ad7e96"} Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.429335 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2" path="/var/lib/kubelet/pods/b967c7f3-d1c7-496d-b63b-85c4ac3ff5a2/volumes" Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.460033 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.924274 4956 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0f13b350-87fa-4854-ba9f-b70130abcf35" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.924527 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0f13b350-87fa-4854-ba9f-b70130abcf35" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 05:48:24 crc kubenswrapper[4956]: I0930 05:48:24.992860 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.137872 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzl7z\" (UniqueName: \"kubernetes.io/projected/058e635e-0fa8-429a-9a36-b52a9f219228-kube-api-access-nzl7z\") pod \"058e635e-0fa8-429a-9a36-b52a9f219228\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.137988 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-dns-svc\") pod \"058e635e-0fa8-429a-9a36-b52a9f219228\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.138015 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-ovsdbserver-sb\") pod \"058e635e-0fa8-429a-9a36-b52a9f219228\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.138079 4956 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-dns-swift-storage-0\") pod \"058e635e-0fa8-429a-9a36-b52a9f219228\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.138144 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-config\") pod \"058e635e-0fa8-429a-9a36-b52a9f219228\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.138218 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-ovsdbserver-nb\") pod \"058e635e-0fa8-429a-9a36-b52a9f219228\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.145249 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058e635e-0fa8-429a-9a36-b52a9f219228-kube-api-access-nzl7z" (OuterVolumeSpecName: "kube-api-access-nzl7z") pod "058e635e-0fa8-429a-9a36-b52a9f219228" (UID: "058e635e-0fa8-429a-9a36-b52a9f219228"). InnerVolumeSpecName "kube-api-access-nzl7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.209336 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "058e635e-0fa8-429a-9a36-b52a9f219228" (UID: "058e635e-0fa8-429a-9a36-b52a9f219228"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.217009 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "058e635e-0fa8-429a-9a36-b52a9f219228" (UID: "058e635e-0fa8-429a-9a36-b52a9f219228"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.228471 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "058e635e-0fa8-429a-9a36-b52a9f219228" (UID: "058e635e-0fa8-429a-9a36-b52a9f219228"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.240171 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-config" (OuterVolumeSpecName: "config") pod "058e635e-0fa8-429a-9a36-b52a9f219228" (UID: "058e635e-0fa8-429a-9a36-b52a9f219228"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.240507 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-config\") pod \"058e635e-0fa8-429a-9a36-b52a9f219228\" (UID: \"058e635e-0fa8-429a-9a36-b52a9f219228\") " Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.241051 4956 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.241065 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.241074 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzl7z\" (UniqueName: \"kubernetes.io/projected/058e635e-0fa8-429a-9a36-b52a9f219228-kube-api-access-nzl7z\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.241084 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:25 crc kubenswrapper[4956]: W0930 05:48:25.241187 4956 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/058e635e-0fa8-429a-9a36-b52a9f219228/volumes/kubernetes.io~configmap/config Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.241199 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-config" (OuterVolumeSpecName: "config") pod 
"058e635e-0fa8-429a-9a36-b52a9f219228" (UID: "058e635e-0fa8-429a-9a36-b52a9f219228"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.268726 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "058e635e-0fa8-429a-9a36-b52a9f219228" (UID: "058e635e-0fa8-429a-9a36-b52a9f219228"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.342169 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.342207 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/058e635e-0fa8-429a-9a36-b52a9f219228-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.347873 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11cda964-5209-4ff7-8cc2-c9cc77d4a105","Type":"ContainerStarted","Data":"a529e05b04d10fd626daa89a16f879d7035c7cf4b1ca1a8debc1da6631057ed0"} Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.349911 4956 generic.go:334] "Generic (PLEG): container finished" podID="058e635e-0fa8-429a-9a36-b52a9f219228" containerID="70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94" exitCode=0 Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.350034 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.350153 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" event={"ID":"058e635e-0fa8-429a-9a36-b52a9f219228","Type":"ContainerDied","Data":"70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94"} Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.350182 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-6sqg8" event={"ID":"058e635e-0fa8-429a-9a36-b52a9f219228","Type":"ContainerDied","Data":"b681bca102512e48d5482d73101083864a0d9f7da333489b0d824268a37ef3d6"} Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.350202 4956 scope.go:117] "RemoveContainer" containerID="70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.400530 4956 scope.go:117] "RemoveContainer" containerID="d3479e8fa45376bbc823172d6c94c1fa5194109c18424b5588c1a682a89478a3" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.435472 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-6sqg8"] Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.448602 4956 scope.go:117] "RemoveContainer" containerID="70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94" Sep 30 05:48:25 crc kubenswrapper[4956]: E0930 05:48:25.467451 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94\": container with ID starting with 70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94 not found: ID does not exist" containerID="70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.467517 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94"} err="failed to get container status \"70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94\": rpc error: code = NotFound desc = could not find container \"70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94\": container with ID starting with 70557e12567a1d1566bdf019ee986175be447619ba1271e024ff001eb9baff94 not found: ID does not exist" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.467547 4956 scope.go:117] "RemoveContainer" containerID="d3479e8fa45376bbc823172d6c94c1fa5194109c18424b5588c1a682a89478a3" Sep 30 05:48:25 crc kubenswrapper[4956]: E0930 05:48:25.471748 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3479e8fa45376bbc823172d6c94c1fa5194109c18424b5588c1a682a89478a3\": container with ID starting with d3479e8fa45376bbc823172d6c94c1fa5194109c18424b5588c1a682a89478a3 not found: ID does not exist" containerID="d3479e8fa45376bbc823172d6c94c1fa5194109c18424b5588c1a682a89478a3" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.472778 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3479e8fa45376bbc823172d6c94c1fa5194109c18424b5588c1a682a89478a3"} err="failed to get container status \"d3479e8fa45376bbc823172d6c94c1fa5194109c18424b5588c1a682a89478a3\": rpc error: code = NotFound desc = could not find container \"d3479e8fa45376bbc823172d6c94c1fa5194109c18424b5588c1a682a89478a3\": container with ID starting with d3479e8fa45376bbc823172d6c94c1fa5194109c18424b5588c1a682a89478a3 not found: ID does not exist" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.477634 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-6sqg8"] Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.680588 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.752678 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-scripts\") pod \"89d1f309-f110-45e2-ae88-8043eca7553e\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.752746 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-combined-ca-bundle\") pod \"89d1f309-f110-45e2-ae88-8043eca7553e\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.752841 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q628p\" (UniqueName: \"kubernetes.io/projected/89d1f309-f110-45e2-ae88-8043eca7553e-kube-api-access-q628p\") pod \"89d1f309-f110-45e2-ae88-8043eca7553e\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.752864 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-config-data\") pod \"89d1f309-f110-45e2-ae88-8043eca7553e\" (UID: \"89d1f309-f110-45e2-ae88-8043eca7553e\") " Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.758000 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-scripts" (OuterVolumeSpecName: "scripts") pod "89d1f309-f110-45e2-ae88-8043eca7553e" (UID: "89d1f309-f110-45e2-ae88-8043eca7553e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.760345 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89d1f309-f110-45e2-ae88-8043eca7553e-kube-api-access-q628p" (OuterVolumeSpecName: "kube-api-access-q628p") pod "89d1f309-f110-45e2-ae88-8043eca7553e" (UID: "89d1f309-f110-45e2-ae88-8043eca7553e"). InnerVolumeSpecName "kube-api-access-q628p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.788269 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89d1f309-f110-45e2-ae88-8043eca7553e" (UID: "89d1f309-f110-45e2-ae88-8043eca7553e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.794354 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-config-data" (OuterVolumeSpecName: "config-data") pod "89d1f309-f110-45e2-ae88-8043eca7553e" (UID: "89d1f309-f110-45e2-ae88-8043eca7553e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.854815 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.855205 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q628p\" (UniqueName: \"kubernetes.io/projected/89d1f309-f110-45e2-ae88-8043eca7553e-kube-api-access-q628p\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.855217 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:25 crc kubenswrapper[4956]: I0930 05:48:25.855227 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89d1f309-f110-45e2-ae88-8043eca7553e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.363348 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058e635e-0fa8-429a-9a36-b52a9f219228" path="/var/lib/kubelet/pods/058e635e-0fa8-429a-9a36-b52a9f219228/volumes" Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.379783 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11cda964-5209-4ff7-8cc2-c9cc77d4a105","Type":"ContainerStarted","Data":"6ebd9160643d20d0700c5ec18b29036d9246f10d4da1e2c05bed8928cc93b6d6"} Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.393408 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xzlrc" event={"ID":"89d1f309-f110-45e2-ae88-8043eca7553e","Type":"ContainerDied","Data":"0326577f49650decb1f09bb021f54718718c83807565f7e221b2866e4a28c072"} Sep 30 05:48:26 crc 
kubenswrapper[4956]: I0930 05:48:26.393461 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0326577f49650decb1f09bb021f54718718c83807565f7e221b2866e4a28c072" Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.393541 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xzlrc" Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.531273 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.531603 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0f13b350-87fa-4854-ba9f-b70130abcf35" containerName="nova-api-log" containerID="cri-o://2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14" gracePeriod=30 Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.531749 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0f13b350-87fa-4854-ba9f-b70130abcf35" containerName="nova-api-api" containerID="cri-o://7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed" gracePeriod=30 Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.540360 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.540612 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="48b3d9a0-6a91-4200-b4e1-98f05330a57c" containerName="nova-scheduler-scheduler" containerID="cri-o://601dbd4a3d7bbb8ba81c524514bd9420298f122143cf7707f0791238548a0ae4" gracePeriod=30 Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.555888 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.556124 4956 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/nova-metadata-0" podUID="b5c9436e-6da5-4040-a6a6-904add2b86b1" containerName="nova-metadata-log" containerID="cri-o://861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657" gracePeriod=30 Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.556200 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b5c9436e-6da5-4040-a6a6-904add2b86b1" containerName="nova-metadata-metadata" containerID="cri-o://aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4" gracePeriod=30 Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.841488 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 05:48:26 crc kubenswrapper[4956]: I0930 05:48:26.841804 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.202268 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.291295 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pbjf\" (UniqueName: \"kubernetes.io/projected/b5c9436e-6da5-4040-a6a6-904add2b86b1-kube-api-access-9pbjf\") pod \"b5c9436e-6da5-4040-a6a6-904add2b86b1\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.291558 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-nova-metadata-tls-certs\") pod \"b5c9436e-6da5-4040-a6a6-904add2b86b1\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.291662 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c9436e-6da5-4040-a6a6-904add2b86b1-logs\") pod \"b5c9436e-6da5-4040-a6a6-904add2b86b1\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.291683 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-config-data\") pod \"b5c9436e-6da5-4040-a6a6-904add2b86b1\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.291775 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-combined-ca-bundle\") pod \"b5c9436e-6da5-4040-a6a6-904add2b86b1\" (UID: \"b5c9436e-6da5-4040-a6a6-904add2b86b1\") " Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.294466 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b5c9436e-6da5-4040-a6a6-904add2b86b1-logs" (OuterVolumeSpecName: "logs") pod "b5c9436e-6da5-4040-a6a6-904add2b86b1" (UID: "b5c9436e-6da5-4040-a6a6-904add2b86b1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.302044 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c9436e-6da5-4040-a6a6-904add2b86b1-kube-api-access-9pbjf" (OuterVolumeSpecName: "kube-api-access-9pbjf") pod "b5c9436e-6da5-4040-a6a6-904add2b86b1" (UID: "b5c9436e-6da5-4040-a6a6-904add2b86b1"). InnerVolumeSpecName "kube-api-access-9pbjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.337299 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-config-data" (OuterVolumeSpecName: "config-data") pod "b5c9436e-6da5-4040-a6a6-904add2b86b1" (UID: "b5c9436e-6da5-4040-a6a6-904add2b86b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.357424 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5c9436e-6da5-4040-a6a6-904add2b86b1" (UID: "b5c9436e-6da5-4040-a6a6-904add2b86b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.363818 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b5c9436e-6da5-4040-a6a6-904add2b86b1" (UID: "b5c9436e-6da5-4040-a6a6-904add2b86b1"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.395147 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c9436e-6da5-4040-a6a6-904add2b86b1-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.395173 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.395184 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.395215 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pbjf\" (UniqueName: \"kubernetes.io/projected/b5c9436e-6da5-4040-a6a6-904add2b86b1-kube-api-access-9pbjf\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.395226 4956 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5c9436e-6da5-4040-a6a6-904add2b86b1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.410722 4956 generic.go:334] "Generic (PLEG): container finished" podID="1691e854-b1af-4974-b06a-716b66af43b4" containerID="d5cc994d3c74d52bfdb0e5c3cb7fcae96c1530c1f2b43189ac8f95df6cc1dae4" exitCode=0 Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.410808 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kb9pc" 
event={"ID":"1691e854-b1af-4974-b06a-716b66af43b4","Type":"ContainerDied","Data":"d5cc994d3c74d52bfdb0e5c3cb7fcae96c1530c1f2b43189ac8f95df6cc1dae4"} Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.413648 4956 generic.go:334] "Generic (PLEG): container finished" podID="b5c9436e-6da5-4040-a6a6-904add2b86b1" containerID="aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4" exitCode=0 Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.413669 4956 generic.go:334] "Generic (PLEG): container finished" podID="b5c9436e-6da5-4040-a6a6-904add2b86b1" containerID="861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657" exitCode=143 Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.413731 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.415213 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5c9436e-6da5-4040-a6a6-904add2b86b1","Type":"ContainerDied","Data":"aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4"} Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.415264 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5c9436e-6da5-4040-a6a6-904add2b86b1","Type":"ContainerDied","Data":"861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657"} Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.415281 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5c9436e-6da5-4040-a6a6-904add2b86b1","Type":"ContainerDied","Data":"952509472e3a8370db65ee4442a0d134b0d813208a1eb4f259607b5b3654528d"} Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.415299 4956 scope.go:117] "RemoveContainer" containerID="aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.427381 4956 generic.go:334] 
"Generic (PLEG): container finished" podID="48b3d9a0-6a91-4200-b4e1-98f05330a57c" containerID="601dbd4a3d7bbb8ba81c524514bd9420298f122143cf7707f0791238548a0ae4" exitCode=0 Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.427432 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"48b3d9a0-6a91-4200-b4e1-98f05330a57c","Type":"ContainerDied","Data":"601dbd4a3d7bbb8ba81c524514bd9420298f122143cf7707f0791238548a0ae4"} Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.429143 4956 generic.go:334] "Generic (PLEG): container finished" podID="0f13b350-87fa-4854-ba9f-b70130abcf35" containerID="2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14" exitCode=143 Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.429174 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0f13b350-87fa-4854-ba9f-b70130abcf35","Type":"ContainerDied","Data":"2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14"} Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.498791 4956 scope.go:117] "RemoveContainer" containerID="861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.501172 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.514727 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.552181 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:27 crc kubenswrapper[4956]: E0930 05:48:27.553097 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d1f309-f110-45e2-ae88-8043eca7553e" containerName="nova-manage" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.553123 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="89d1f309-f110-45e2-ae88-8043eca7553e" containerName="nova-manage" Sep 30 05:48:27 crc kubenswrapper[4956]: E0930 05:48:27.553145 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c9436e-6da5-4040-a6a6-904add2b86b1" containerName="nova-metadata-log" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.553152 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c9436e-6da5-4040-a6a6-904add2b86b1" containerName="nova-metadata-log" Sep 30 05:48:27 crc kubenswrapper[4956]: E0930 05:48:27.553187 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058e635e-0fa8-429a-9a36-b52a9f219228" containerName="dnsmasq-dns" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.553196 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="058e635e-0fa8-429a-9a36-b52a9f219228" containerName="dnsmasq-dns" Sep 30 05:48:27 crc kubenswrapper[4956]: E0930 05:48:27.553236 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058e635e-0fa8-429a-9a36-b52a9f219228" containerName="init" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.553245 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="058e635e-0fa8-429a-9a36-b52a9f219228" containerName="init" Sep 30 05:48:27 crc kubenswrapper[4956]: E0930 05:48:27.553262 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c9436e-6da5-4040-a6a6-904add2b86b1" containerName="nova-metadata-metadata" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.553274 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c9436e-6da5-4040-a6a6-904add2b86b1" containerName="nova-metadata-metadata" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.553690 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="058e635e-0fa8-429a-9a36-b52a9f219228" containerName="dnsmasq-dns" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.553712 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="89d1f309-f110-45e2-ae88-8043eca7553e" containerName="nova-manage" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.553733 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c9436e-6da5-4040-a6a6-904add2b86b1" containerName="nova-metadata-log" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.553754 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c9436e-6da5-4040-a6a6-904add2b86b1" containerName="nova-metadata-metadata" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.555652 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.563484 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.563708 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.569378 4956 scope.go:117] "RemoveContainer" containerID="aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4" Sep 30 05:48:27 crc kubenswrapper[4956]: E0930 05:48:27.571251 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4\": container with ID starting with aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4 not found: ID does not exist" containerID="aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.571293 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4"} err="failed to get container status \"aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4\": rpc error: 
code = NotFound desc = could not find container \"aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4\": container with ID starting with aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4 not found: ID does not exist" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.571316 4956 scope.go:117] "RemoveContainer" containerID="861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657" Sep 30 05:48:27 crc kubenswrapper[4956]: E0930 05:48:27.572787 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657\": container with ID starting with 861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657 not found: ID does not exist" containerID="861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.572811 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657"} err="failed to get container status \"861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657\": rpc error: code = NotFound desc = could not find container \"861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657\": container with ID starting with 861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657 not found: ID does not exist" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.572827 4956 scope.go:117] "RemoveContainer" containerID="aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.575790 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4"} err="failed to get container status \"aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4\": 
rpc error: code = NotFound desc = could not find container \"aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4\": container with ID starting with aa5f8fb053e5303d3931a65f82ba97d834050482486360082d5a3274d9ed10d4 not found: ID does not exist" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.575856 4956 scope.go:117] "RemoveContainer" containerID="861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.577851 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657"} err="failed to get container status \"861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657\": rpc error: code = NotFound desc = could not find container \"861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657\": container with ID starting with 861a518bd7f4ef24f06f48accc70111e01ec3469ef3e84f7f6efb76fc4ae6657 not found: ID does not exist" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.595093 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.699872 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.704122 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wff9k\" (UniqueName: \"kubernetes.io/projected/2f34a355-be28-482b-a242-f113c5c9fd59-kube-api-access-wff9k\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.704782 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.704844 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f34a355-be28-482b-a242-f113c5c9fd59-logs\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.704901 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.704940 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-config-data\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc 
kubenswrapper[4956]: I0930 05:48:27.806310 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t29k4\" (UniqueName: \"kubernetes.io/projected/48b3d9a0-6a91-4200-b4e1-98f05330a57c-kube-api-access-t29k4\") pod \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\" (UID: \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\") " Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.806709 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b3d9a0-6a91-4200-b4e1-98f05330a57c-combined-ca-bundle\") pod \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\" (UID: \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\") " Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.806776 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b3d9a0-6a91-4200-b4e1-98f05330a57c-config-data\") pod \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\" (UID: \"48b3d9a0-6a91-4200-b4e1-98f05330a57c\") " Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.807086 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.807192 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f34a355-be28-482b-a242-f113c5c9fd59-logs\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.807261 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.807304 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-config-data\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.807338 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wff9k\" (UniqueName: \"kubernetes.io/projected/2f34a355-be28-482b-a242-f113c5c9fd59-kube-api-access-wff9k\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.807930 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f34a355-be28-482b-a242-f113c5c9fd59-logs\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.810267 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.814264 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b3d9a0-6a91-4200-b4e1-98f05330a57c-kube-api-access-t29k4" (OuterVolumeSpecName: "kube-api-access-t29k4") pod "48b3d9a0-6a91-4200-b4e1-98f05330a57c" (UID: "48b3d9a0-6a91-4200-b4e1-98f05330a57c"). 
InnerVolumeSpecName "kube-api-access-t29k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.814767 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-config-data\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.823720 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wff9k\" (UniqueName: \"kubernetes.io/projected/2f34a355-be28-482b-a242-f113c5c9fd59-kube-api-access-wff9k\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.824278 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.835093 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b3d9a0-6a91-4200-b4e1-98f05330a57c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48b3d9a0-6a91-4200-b4e1-98f05330a57c" (UID: "48b3d9a0-6a91-4200-b4e1-98f05330a57c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.842283 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b3d9a0-6a91-4200-b4e1-98f05330a57c-config-data" (OuterVolumeSpecName: "config-data") pod "48b3d9a0-6a91-4200-b4e1-98f05330a57c" (UID: "48b3d9a0-6a91-4200-b4e1-98f05330a57c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.892720 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.909711 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t29k4\" (UniqueName: \"kubernetes.io/projected/48b3d9a0-6a91-4200-b4e1-98f05330a57c-kube-api-access-t29k4\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.909750 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b3d9a0-6a91-4200-b4e1-98f05330a57c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:27 crc kubenswrapper[4956]: I0930 05:48:27.909762 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b3d9a0-6a91-4200-b4e1-98f05330a57c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.353153 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c9436e-6da5-4040-a6a6-904add2b86b1" path="/var/lib/kubelet/pods/b5c9436e-6da5-4040-a6a6-904add2b86b1/volumes" Sep 30 05:48:28 crc kubenswrapper[4956]: W0930 05:48:28.390908 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f34a355_be28_482b_a242_f113c5c9fd59.slice/crio-b642c8bfde7bf71b80d7e3dac32d0de64dee4eb4385461daa613bf928a444a66 WatchSource:0}: Error finding container b642c8bfde7bf71b80d7e3dac32d0de64dee4eb4385461daa613bf928a444a66: Status 404 returned error can't find the container with id b642c8bfde7bf71b80d7e3dac32d0de64dee4eb4385461daa613bf928a444a66 Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.399470 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:48:28 crc 
kubenswrapper[4956]: I0930 05:48:28.446961 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f34a355-be28-482b-a242-f113c5c9fd59","Type":"ContainerStarted","Data":"b642c8bfde7bf71b80d7e3dac32d0de64dee4eb4385461daa613bf928a444a66"} Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.448319 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"48b3d9a0-6a91-4200-b4e1-98f05330a57c","Type":"ContainerDied","Data":"1dac25edb6d09dd1d3a775c85f567f336045c88294fb28dec7bb8b9a4abf0392"} Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.448361 4956 scope.go:117] "RemoveContainer" containerID="601dbd4a3d7bbb8ba81c524514bd9420298f122143cf7707f0791238548a0ae4" Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.448447 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.455092 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11cda964-5209-4ff7-8cc2-c9cc77d4a105","Type":"ContainerStarted","Data":"0eae743885c8594d01610d8a29dce5f20bc4d7a287e24d95f5650fe987575993"} Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.455161 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.513869 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6764960540000002 podStartE2EDuration="6.513852863s" podCreationTimestamp="2025-09-30 05:48:22 +0000 UTC" firstStartedPulling="2025-09-30 05:48:23.381806461 +0000 UTC m=+1173.708926986" lastFinishedPulling="2025-09-30 05:48:27.21916327 +0000 UTC m=+1177.546283795" observedRunningTime="2025-09-30 05:48:28.481381656 +0000 UTC m=+1178.808502181" watchObservedRunningTime="2025-09-30 05:48:28.513852863 +0000 UTC 
m=+1178.840973388"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.538658 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.548089 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.558430 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 05:48:28 crc kubenswrapper[4956]: E0930 05:48:28.558818 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b3d9a0-6a91-4200-b4e1-98f05330a57c" containerName="nova-scheduler-scheduler"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.558835 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b3d9a0-6a91-4200-b4e1-98f05330a57c" containerName="nova-scheduler-scheduler"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.559042 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b3d9a0-6a91-4200-b4e1-98f05330a57c" containerName="nova-scheduler-scheduler"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.559779 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.568969 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.577521 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.728405 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls28q\" (UniqueName: \"kubernetes.io/projected/0d20a784-4e75-4f0b-bcf2-72ab60418bee-kube-api-access-ls28q\") pod \"nova-scheduler-0\" (UID: \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\") " pod="openstack/nova-scheduler-0"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.728558 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d20a784-4e75-4f0b-bcf2-72ab60418bee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\") " pod="openstack/nova-scheduler-0"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.728604 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d20a784-4e75-4f0b-bcf2-72ab60418bee-config-data\") pod \"nova-scheduler-0\" (UID: \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\") " pod="openstack/nova-scheduler-0"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.830247 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d20a784-4e75-4f0b-bcf2-72ab60418bee-config-data\") pod \"nova-scheduler-0\" (UID: \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\") " pod="openstack/nova-scheduler-0"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.830315 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls28q\" (UniqueName: \"kubernetes.io/projected/0d20a784-4e75-4f0b-bcf2-72ab60418bee-kube-api-access-ls28q\") pod \"nova-scheduler-0\" (UID: \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\") " pod="openstack/nova-scheduler-0"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.830428 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d20a784-4e75-4f0b-bcf2-72ab60418bee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\") " pod="openstack/nova-scheduler-0"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.834683 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d20a784-4e75-4f0b-bcf2-72ab60418bee-config-data\") pod \"nova-scheduler-0\" (UID: \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\") " pod="openstack/nova-scheduler-0"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.835170 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d20a784-4e75-4f0b-bcf2-72ab60418bee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\") " pod="openstack/nova-scheduler-0"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.845843 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls28q\" (UniqueName: \"kubernetes.io/projected/0d20a784-4e75-4f0b-bcf2-72ab60418bee-kube-api-access-ls28q\") pod \"nova-scheduler-0\" (UID: \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\") " pod="openstack/nova-scheduler-0"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.849628 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kb9pc"
Sep 30 05:48:28 crc kubenswrapper[4956]: I0930 05:48:28.895963 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.035836 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-combined-ca-bundle\") pod \"1691e854-b1af-4974-b06a-716b66af43b4\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") "
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.036922 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-scripts\") pod \"1691e854-b1af-4974-b06a-716b66af43b4\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") "
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.037057 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-config-data\") pod \"1691e854-b1af-4974-b06a-716b66af43b4\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") "
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.037085 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4nsb\" (UniqueName: \"kubernetes.io/projected/1691e854-b1af-4974-b06a-716b66af43b4-kube-api-access-p4nsb\") pod \"1691e854-b1af-4974-b06a-716b66af43b4\" (UID: \"1691e854-b1af-4974-b06a-716b66af43b4\") "
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.041656 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-scripts" (OuterVolumeSpecName: "scripts") pod "1691e854-b1af-4974-b06a-716b66af43b4" (UID: "1691e854-b1af-4974-b06a-716b66af43b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.041740 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1691e854-b1af-4974-b06a-716b66af43b4-kube-api-access-p4nsb" (OuterVolumeSpecName: "kube-api-access-p4nsb") pod "1691e854-b1af-4974-b06a-716b66af43b4" (UID: "1691e854-b1af-4974-b06a-716b66af43b4"). InnerVolumeSpecName "kube-api-access-p4nsb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.069208 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1691e854-b1af-4974-b06a-716b66af43b4" (UID: "1691e854-b1af-4974-b06a-716b66af43b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.087877 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-config-data" (OuterVolumeSpecName: "config-data") pod "1691e854-b1af-4974-b06a-716b66af43b4" (UID: "1691e854-b1af-4974-b06a-716b66af43b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.140672 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.140696 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4nsb\" (UniqueName: \"kubernetes.io/projected/1691e854-b1af-4974-b06a-716b66af43b4-kube-api-access-p4nsb\") on node \"crc\" DevicePath \"\""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.140706 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.140714 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1691e854-b1af-4974-b06a-716b66af43b4-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.184838 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.351320 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dmb5\" (UniqueName: \"kubernetes.io/projected/0f13b350-87fa-4854-ba9f-b70130abcf35-kube-api-access-5dmb5\") pod \"0f13b350-87fa-4854-ba9f-b70130abcf35\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") "
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.352611 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f13b350-87fa-4854-ba9f-b70130abcf35-config-data\") pod \"0f13b350-87fa-4854-ba9f-b70130abcf35\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") "
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.353364 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f13b350-87fa-4854-ba9f-b70130abcf35-combined-ca-bundle\") pod \"0f13b350-87fa-4854-ba9f-b70130abcf35\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") "
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.353422 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f13b350-87fa-4854-ba9f-b70130abcf35-logs\") pod \"0f13b350-87fa-4854-ba9f-b70130abcf35\" (UID: \"0f13b350-87fa-4854-ba9f-b70130abcf35\") "
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.353986 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f13b350-87fa-4854-ba9f-b70130abcf35-logs" (OuterVolumeSpecName: "logs") pod "0f13b350-87fa-4854-ba9f-b70130abcf35" (UID: "0f13b350-87fa-4854-ba9f-b70130abcf35"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.357834 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f13b350-87fa-4854-ba9f-b70130abcf35-kube-api-access-5dmb5" (OuterVolumeSpecName: "kube-api-access-5dmb5") pod "0f13b350-87fa-4854-ba9f-b70130abcf35" (UID: "0f13b350-87fa-4854-ba9f-b70130abcf35"). InnerVolumeSpecName "kube-api-access-5dmb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.357961 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f13b350-87fa-4854-ba9f-b70130abcf35-logs\") on node \"crc\" DevicePath \"\""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.357979 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dmb5\" (UniqueName: \"kubernetes.io/projected/0f13b350-87fa-4854-ba9f-b70130abcf35-kube-api-access-5dmb5\") on node \"crc\" DevicePath \"\""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.387706 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f13b350-87fa-4854-ba9f-b70130abcf35-config-data" (OuterVolumeSpecName: "config-data") pod "0f13b350-87fa-4854-ba9f-b70130abcf35" (UID: "0f13b350-87fa-4854-ba9f-b70130abcf35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.389530 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f13b350-87fa-4854-ba9f-b70130abcf35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f13b350-87fa-4854-ba9f-b70130abcf35" (UID: "0f13b350-87fa-4854-ba9f-b70130abcf35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.469066 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f13b350-87fa-4854-ba9f-b70130abcf35-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.469093 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f13b350-87fa-4854-ba9f-b70130abcf35-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.477901 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.488709 4956 generic.go:334] "Generic (PLEG): container finished" podID="0f13b350-87fa-4854-ba9f-b70130abcf35" containerID="7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed" exitCode=0
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.488816 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0f13b350-87fa-4854-ba9f-b70130abcf35","Type":"ContainerDied","Data":"7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed"}
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.488865 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0f13b350-87fa-4854-ba9f-b70130abcf35","Type":"ContainerDied","Data":"b2f00b063e6d692e0182cc17a1089937fce2fbed961dadfa1df7e5b07f7ba294"}
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.488883 4956 scope.go:117] "RemoveContainer" containerID="7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.489037 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.491752 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d20a784-4e75-4f0b-bcf2-72ab60418bee","Type":"ContainerStarted","Data":"82007b71982ab22cb05e3bde52940313367eaf08e059961489f1f5cc09a59c65"}
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.500372 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kb9pc" event={"ID":"1691e854-b1af-4974-b06a-716b66af43b4","Type":"ContainerDied","Data":"1befaface2af823f827a17a7c213e598a349aa168a6e84bedf17325ee19dd739"}
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.500420 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1befaface2af823f827a17a7c213e598a349aa168a6e84bedf17325ee19dd739"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.500483 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kb9pc"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.504202 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f34a355-be28-482b-a242-f113c5c9fd59","Type":"ContainerStarted","Data":"0de317ef60c75e49530deccf619e09f72fb828eafb152661c89aa4170abea844"}
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.504247 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f34a355-be28-482b-a242-f113c5c9fd59","Type":"ContainerStarted","Data":"a959ea428a0d9f9f8d0c1946bc11a4f84312bee84d8324d9126555100bf2e1c9"}
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.521428 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Sep 30 05:48:29 crc kubenswrapper[4956]: E0930 05:48:29.521801 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f13b350-87fa-4854-ba9f-b70130abcf35" containerName="nova-api-log"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.521821 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f13b350-87fa-4854-ba9f-b70130abcf35" containerName="nova-api-log"
Sep 30 05:48:29 crc kubenswrapper[4956]: E0930 05:48:29.521855 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f13b350-87fa-4854-ba9f-b70130abcf35" containerName="nova-api-api"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.521860 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f13b350-87fa-4854-ba9f-b70130abcf35" containerName="nova-api-api"
Sep 30 05:48:29 crc kubenswrapper[4956]: E0930 05:48:29.521872 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691e854-b1af-4974-b06a-716b66af43b4" containerName="nova-cell1-conductor-db-sync"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.521877 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691e854-b1af-4974-b06a-716b66af43b4" containerName="nova-cell1-conductor-db-sync"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.522055 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f13b350-87fa-4854-ba9f-b70130abcf35" containerName="nova-api-api"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.522071 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691e854-b1af-4974-b06a-716b66af43b4" containerName="nova-cell1-conductor-db-sync"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.522081 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f13b350-87fa-4854-ba9f-b70130abcf35" containerName="nova-api-log"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.522723 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.534000 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.534619 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.551030 4956 scope.go:117] "RemoveContainer" containerID="2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.571294 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0515ec62-950a-4ab8-8462-7030b37609db-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0515ec62-950a-4ab8-8462-7030b37609db\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.572028 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.572008727 podStartE2EDuration="2.572008727s" podCreationTimestamp="2025-09-30 05:48:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:48:29.53764066 +0000 UTC m=+1179.864761185" watchObservedRunningTime="2025-09-30 05:48:29.572008727 +0000 UTC m=+1179.899129252"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.572468 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bp2p\" (UniqueName: \"kubernetes.io/projected/0515ec62-950a-4ab8-8462-7030b37609db-kube-api-access-5bp2p\") pod \"nova-cell1-conductor-0\" (UID: \"0515ec62-950a-4ab8-8462-7030b37609db\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.572666 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0515ec62-950a-4ab8-8462-7030b37609db-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0515ec62-950a-4ab8-8462-7030b37609db\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.594151 4956 scope.go:117] "RemoveContainer" containerID="7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed"
Sep 30 05:48:29 crc kubenswrapper[4956]: E0930 05:48:29.594685 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed\": container with ID starting with 7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed not found: ID does not exist" containerID="7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.594717 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed"} err="failed to get container status \"7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed\": rpc error: code = NotFound desc = could not find container \"7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed\": container with ID starting with 7542e2d9fa9f495fcbf754e0edc15bc97962f3a772548e5369a37126df8064ed not found: ID does not exist"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.594737 4956 scope.go:117] "RemoveContainer" containerID="2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14"
Sep 30 05:48:29 crc kubenswrapper[4956]: E0930 05:48:29.595257 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14\": container with ID starting with 2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14 not found: ID does not exist" containerID="2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.595276 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14"} err="failed to get container status \"2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14\": rpc error: code = NotFound desc = could not find container \"2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14\": container with ID starting with 2a5bc5d3aa969dc4a0ff56ec8b5b0254130c67ca7b718d7472a27c5949fc6b14 not found: ID does not exist"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.613491 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.621592 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.637217 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.643176 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.675877 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0515ec62-950a-4ab8-8462-7030b37609db-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0515ec62-950a-4ab8-8462-7030b37609db\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.676040 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8761fc4-7d7b-428f-9987-271c1c79b40e-logs\") pod \"nova-api-0\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.676176 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bp2p\" (UniqueName: \"kubernetes.io/projected/0515ec62-950a-4ab8-8462-7030b37609db-kube-api-access-5bp2p\") pod \"nova-cell1-conductor-0\" (UID: \"0515ec62-950a-4ab8-8462-7030b37609db\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.676219 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-762vq\" (UniqueName: \"kubernetes.io/projected/b8761fc4-7d7b-428f-9987-271c1c79b40e-kube-api-access-762vq\") pod \"nova-api-0\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.676284 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8761fc4-7d7b-428f-9987-271c1c79b40e-config-data\") pod \"nova-api-0\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.676392 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0515ec62-950a-4ab8-8462-7030b37609db-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0515ec62-950a-4ab8-8462-7030b37609db\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.676455 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8761fc4-7d7b-428f-9987-271c1c79b40e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.679266 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.683877 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.686909 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0515ec62-950a-4ab8-8462-7030b37609db-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0515ec62-950a-4ab8-8462-7030b37609db\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.694170 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0515ec62-950a-4ab8-8462-7030b37609db-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0515ec62-950a-4ab8-8462-7030b37609db\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.700814 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bp2p\" (UniqueName: \"kubernetes.io/projected/0515ec62-950a-4ab8-8462-7030b37609db-kube-api-access-5bp2p\") pod \"nova-cell1-conductor-0\" (UID: \"0515ec62-950a-4ab8-8462-7030b37609db\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.778344 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-762vq\" (UniqueName: \"kubernetes.io/projected/b8761fc4-7d7b-428f-9987-271c1c79b40e-kube-api-access-762vq\") pod \"nova-api-0\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.778406 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8761fc4-7d7b-428f-9987-271c1c79b40e-config-data\") pod \"nova-api-0\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.778466 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8761fc4-7d7b-428f-9987-271c1c79b40e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.778561 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8761fc4-7d7b-428f-9987-271c1c79b40e-logs\") pod \"nova-api-0\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.779046 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8761fc4-7d7b-428f-9987-271c1c79b40e-logs\") pod \"nova-api-0\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.781622 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8761fc4-7d7b-428f-9987-271c1c79b40e-config-data\") pod \"nova-api-0\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.782562 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8761fc4-7d7b-428f-9987-271c1c79b40e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.792581 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-762vq\" (UniqueName: \"kubernetes.io/projected/b8761fc4-7d7b-428f-9987-271c1c79b40e-kube-api-access-762vq\") pod \"nova-api-0\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " pod="openstack/nova-api-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.903173 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:29 crc kubenswrapper[4956]: I0930 05:48:29.973312 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 05:48:30 crc kubenswrapper[4956]: I0930 05:48:30.357928 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f13b350-87fa-4854-ba9f-b70130abcf35" path="/var/lib/kubelet/pods/0f13b350-87fa-4854-ba9f-b70130abcf35/volumes"
Sep 30 05:48:30 crc kubenswrapper[4956]: I0930 05:48:30.358851 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b3d9a0-6a91-4200-b4e1-98f05330a57c" path="/var/lib/kubelet/pods/48b3d9a0-6a91-4200-b4e1-98f05330a57c/volumes"
Sep 30 05:48:30 crc kubenswrapper[4956]: I0930 05:48:30.421543 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Sep 30 05:48:30 crc kubenswrapper[4956]: I0930 05:48:30.507019 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 05:48:30 crc kubenswrapper[4956]: I0930 05:48:30.521573 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0515ec62-950a-4ab8-8462-7030b37609db","Type":"ContainerStarted","Data":"f6fadee2327a879c68876dacb68fb0bacbcb2687765775ec075385aae2aeb4d7"}
Sep 30 05:48:30 crc kubenswrapper[4956]: I0930 05:48:30.523355 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8761fc4-7d7b-428f-9987-271c1c79b40e","Type":"ContainerStarted","Data":"0532237d55ecb29d7b4d008aea5f1430d8a67a6597e945ea360b3f5e7eb5f12e"}
Sep 30 05:48:30 crc kubenswrapper[4956]: I0930 05:48:30.527351 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d20a784-4e75-4f0b-bcf2-72ab60418bee","Type":"ContainerStarted","Data":"16163f5ab02a322505f394312fd9f2e58fca953b87746af3abb1e923670509a5"}
Sep 30 05:48:30 crc kubenswrapper[4956]: I0930 05:48:30.559047 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.559029164 podStartE2EDuration="2.559029164s" podCreationTimestamp="2025-09-30 05:48:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:48:30.546540526 +0000 UTC m=+1180.873661061" watchObservedRunningTime="2025-09-30 05:48:30.559029164 +0000 UTC m=+1180.886149689"
Sep 30 05:48:31 crc kubenswrapper[4956]: I0930 05:48:31.539066 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0515ec62-950a-4ab8-8462-7030b37609db","Type":"ContainerStarted","Data":"87d2812fb7f9fd991b558387097660ae92d14ae2fc99ca2c2e1e3c79c30d0b51"}
Sep 30 05:48:31 crc kubenswrapper[4956]: I0930 05:48:31.539654 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:31 crc kubenswrapper[4956]: I0930 05:48:31.542981 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8761fc4-7d7b-428f-9987-271c1c79b40e","Type":"ContainerStarted","Data":"47257eb2d258e9da1ffd7d6b50611cb0e28bc20cfd4f8cd46e1d0b753d420f2e"}
Sep 30 05:48:31 crc kubenswrapper[4956]: I0930 05:48:31.543013 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8761fc4-7d7b-428f-9987-271c1c79b40e","Type":"ContainerStarted","Data":"30c9eb4dcda0ff80fea1ee0b265875acdc1a91ca929d6bcf90a8bf93767a5109"}
Sep 30 05:48:31 crc kubenswrapper[4956]: I0930 05:48:31.563742 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.563718738 podStartE2EDuration="2.563718738s" podCreationTimestamp="2025-09-30 05:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:48:31.558470346 +0000 UTC m=+1181.885590881" watchObservedRunningTime="2025-09-30 05:48:31.563718738 +0000 UTC m=+1181.890839263"
Sep 30 05:48:31 crc kubenswrapper[4956]: I0930 05:48:31.579660 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.579645512 podStartE2EDuration="2.579645512s" podCreationTimestamp="2025-09-30 05:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:48:31.574818742 +0000 UTC m=+1181.901939287" watchObservedRunningTime="2025-09-30 05:48:31.579645512 +0000 UTC m=+1181.906766037"
Sep 30 05:48:32 crc kubenswrapper[4956]: I0930 05:48:32.893318 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Sep 30 05:48:32 crc kubenswrapper[4956]: I0930 05:48:32.893368 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Sep 30 05:48:33 crc kubenswrapper[4956]: I0930 05:48:33.896246 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Sep 30 05:48:37 crc kubenswrapper[4956]: I0930 05:48:37.893975 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Sep 30 05:48:37 crc kubenswrapper[4956]: I0930 05:48:37.894763 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Sep 30 05:48:38 crc kubenswrapper[4956]: I0930 05:48:38.896739 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Sep 30 05:48:38 crc kubenswrapper[4956]: I0930 05:48:38.914344 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2f34a355-be28-482b-a242-f113c5c9fd59" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 30 05:48:38 crc kubenswrapper[4956]: I0930 05:48:38.914407 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2f34a355-be28-482b-a242-f113c5c9fd59" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 30 05:48:38 crc kubenswrapper[4956]: I0930 05:48:38.925851 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Sep 30 05:48:39 crc kubenswrapper[4956]: I0930 05:48:39.692997 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Sep 30 05:48:39 crc kubenswrapper[4956]: I0930 05:48:39.929608 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Sep 30 05:48:39 crc kubenswrapper[4956]: I0930 05:48:39.974131 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 05:48:39 crc kubenswrapper[4956]: I0930 05:48:39.974438 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 05:48:41 crc kubenswrapper[4956]: I0930 05:48:41.056307 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b8761fc4-7d7b-428f-9987-271c1c79b40e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Sep 30 05:48:41 crc kubenswrapper[4956]: I0930 05:48:41.056324 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b8761fc4-7d7b-428f-9987-271c1c79b40e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Sep 30 05:48:47 crc kubenswrapper[4956]: I0930 05:48:47.897612 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup"
status="started" pod="openstack/nova-metadata-0" Sep 30 05:48:47 crc kubenswrapper[4956]: I0930 05:48:47.898049 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 05:48:47 crc kubenswrapper[4956]: I0930 05:48:47.902680 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 05:48:47 crc kubenswrapper[4956]: I0930 05:48:47.905896 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 05:48:49 crc kubenswrapper[4956]: I0930 05:48:49.980351 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 05:48:49 crc kubenswrapper[4956]: I0930 05:48:49.981254 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 05:48:49 crc kubenswrapper[4956]: I0930 05:48:49.991200 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 05:48:49 crc kubenswrapper[4956]: I0930 05:48:49.991527 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.714693 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.760196 4956 generic.go:334] "Generic (PLEG): container finished" podID="3c721d55-4875-4026-b74b-28f4afc31f99" containerID="972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f" exitCode=137 Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.760681 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3c721d55-4875-4026-b74b-28f4afc31f99","Type":"ContainerDied","Data":"972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f"} Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.760752 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3c721d55-4875-4026-b74b-28f4afc31f99","Type":"ContainerDied","Data":"7b4f8befea8df448fdd5b4840029c9817371ad90dece0aa050417144e2d31eee"} Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.760772 4956 scope.go:117] "RemoveContainer" containerID="972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.760981 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.761613 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.767609 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.781520 4956 scope.go:117] "RemoveContainer" containerID="972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f" Sep 30 05:48:50 crc kubenswrapper[4956]: E0930 05:48:50.781936 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f\": container with ID starting with 972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f not found: ID does not exist" containerID="972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.781978 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f"} err="failed to get container status \"972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f\": rpc error: code = NotFound desc = could not find container \"972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f\": container with ID starting with 972bab957006e3ef95ce4472993657e1dfa9ae53ba8166ae22ed9897ecb60b9f not found: ID does not exist" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.838741 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7chbh\" (UniqueName: \"kubernetes.io/projected/3c721d55-4875-4026-b74b-28f4afc31f99-kube-api-access-7chbh\") pod \"3c721d55-4875-4026-b74b-28f4afc31f99\" (UID: \"3c721d55-4875-4026-b74b-28f4afc31f99\") " Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.838816 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3c721d55-4875-4026-b74b-28f4afc31f99-config-data\") pod \"3c721d55-4875-4026-b74b-28f4afc31f99\" (UID: \"3c721d55-4875-4026-b74b-28f4afc31f99\") " Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.838987 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c721d55-4875-4026-b74b-28f4afc31f99-combined-ca-bundle\") pod \"3c721d55-4875-4026-b74b-28f4afc31f99\" (UID: \"3c721d55-4875-4026-b74b-28f4afc31f99\") " Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.867457 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c721d55-4875-4026-b74b-28f4afc31f99-kube-api-access-7chbh" (OuterVolumeSpecName: "kube-api-access-7chbh") pod "3c721d55-4875-4026-b74b-28f4afc31f99" (UID: "3c721d55-4875-4026-b74b-28f4afc31f99"). InnerVolumeSpecName "kube-api-access-7chbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.898591 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c721d55-4875-4026-b74b-28f4afc31f99-config-data" (OuterVolumeSpecName: "config-data") pod "3c721d55-4875-4026-b74b-28f4afc31f99" (UID: "3c721d55-4875-4026-b74b-28f4afc31f99"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.943145 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7chbh\" (UniqueName: \"kubernetes.io/projected/3c721d55-4875-4026-b74b-28f4afc31f99-kube-api-access-7chbh\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.943177 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c721d55-4875-4026-b74b-28f4afc31f99-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.953172 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c721d55-4875-4026-b74b-28f4afc31f99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c721d55-4875-4026-b74b-28f4afc31f99" (UID: "3c721d55-4875-4026-b74b-28f4afc31f99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.967954 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-82dqm"] Sep 30 05:48:50 crc kubenswrapper[4956]: E0930 05:48:50.968403 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c721d55-4875-4026-b74b-28f4afc31f99" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.968423 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c721d55-4875-4026-b74b-28f4afc31f99" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.968633 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c721d55-4875-4026-b74b-28f4afc31f99" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.969682 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:50 crc kubenswrapper[4956]: I0930 05:48:50.982938 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-82dqm"] Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.047187 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-dns-svc\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.047455 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-649l8\" (UniqueName: \"kubernetes.io/projected/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-kube-api-access-649l8\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.056970 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-ovsdbserver-sb\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.057042 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-config\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.057082 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-dns-swift-storage-0\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.057244 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-ovsdbserver-nb\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.057536 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c721d55-4875-4026-b74b-28f4afc31f99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.125355 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.139891 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.152564 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.154704 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.157734 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.157820 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.158422 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.158974 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-649l8\" (UniqueName: \"kubernetes.io/projected/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-kube-api-access-649l8\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.159017 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-ovsdbserver-sb\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.159036 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-config\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.159067 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-dns-swift-storage-0\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.159134 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-ovsdbserver-nb\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.159228 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-dns-svc\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.160250 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-config\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.160340 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-dns-svc\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.160433 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-ovsdbserver-sb\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: 
\"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.160606 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-dns-swift-storage-0\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.160946 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-ovsdbserver-nb\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.165792 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.190248 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-649l8\" (UniqueName: \"kubernetes.io/projected/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-kube-api-access-649l8\") pod \"dnsmasq-dns-54599d8f7-82dqm\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.261163 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9jxg\" (UniqueName: \"kubernetes.io/projected/5b82f755-357a-4b4e-84bd-1712077f17a5-kube-api-access-s9jxg\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.261224 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b82f755-357a-4b4e-84bd-1712077f17a5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.261271 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b82f755-357a-4b4e-84bd-1712077f17a5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.261307 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b82f755-357a-4b4e-84bd-1712077f17a5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.261487 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b82f755-357a-4b4e-84bd-1712077f17a5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.321938 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.364090 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b82f755-357a-4b4e-84bd-1712077f17a5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.366174 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b82f755-357a-4b4e-84bd-1712077f17a5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.366230 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b82f755-357a-4b4e-84bd-1712077f17a5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.366312 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b82f755-357a-4b4e-84bd-1712077f17a5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.366417 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9jxg\" (UniqueName: \"kubernetes.io/projected/5b82f755-357a-4b4e-84bd-1712077f17a5-kube-api-access-s9jxg\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" 
Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.370281 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b82f755-357a-4b4e-84bd-1712077f17a5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.371299 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b82f755-357a-4b4e-84bd-1712077f17a5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.371637 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b82f755-357a-4b4e-84bd-1712077f17a5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.373864 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b82f755-357a-4b4e-84bd-1712077f17a5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.385045 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9jxg\" (UniqueName: \"kubernetes.io/projected/5b82f755-357a-4b4e-84bd-1712077f17a5-kube-api-access-s9jxg\") pod \"nova-cell1-novncproxy-0\" (UID: \"5b82f755-357a-4b4e-84bd-1712077f17a5\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.487138 4956 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:51 crc kubenswrapper[4956]: I0930 05:48:51.876431 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-82dqm"] Sep 30 05:48:52 crc kubenswrapper[4956]: W0930 05:48:52.035844 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b82f755_357a_4b4e_84bd_1712077f17a5.slice/crio-20727c69e8b70d2b39ba3eb10b98bb76733a7742ab000030308f93ce3c90deb1 WatchSource:0}: Error finding container 20727c69e8b70d2b39ba3eb10b98bb76733a7742ab000030308f93ce3c90deb1: Status 404 returned error can't find the container with id 20727c69e8b70d2b39ba3eb10b98bb76733a7742ab000030308f93ce3c90deb1 Sep 30 05:48:52 crc kubenswrapper[4956]: I0930 05:48:52.037696 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 05:48:52 crc kubenswrapper[4956]: I0930 05:48:52.352847 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c721d55-4875-4026-b74b-28f4afc31f99" path="/var/lib/kubelet/pods/3c721d55-4875-4026-b74b-28f4afc31f99/volumes" Sep 30 05:48:52 crc kubenswrapper[4956]: I0930 05:48:52.780280 4956 generic.go:334] "Generic (PLEG): container finished" podID="09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" containerID="78506152d99ea2083e4224957f6fb315167fa2dc7c97010edfdf553406f8e7fc" exitCode=0 Sep 30 05:48:52 crc kubenswrapper[4956]: I0930 05:48:52.780345 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-82dqm" event={"ID":"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7","Type":"ContainerDied","Data":"78506152d99ea2083e4224957f6fb315167fa2dc7c97010edfdf553406f8e7fc"} Sep 30 05:48:52 crc kubenswrapper[4956]: I0930 05:48:52.780371 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-82dqm" 
event={"ID":"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7","Type":"ContainerStarted","Data":"d1b3f3098425edd8ff112d35ca5e4e2d74d7164bb8ff3d297ddedf99d260fd7d"} Sep 30 05:48:52 crc kubenswrapper[4956]: I0930 05:48:52.784803 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5b82f755-357a-4b4e-84bd-1712077f17a5","Type":"ContainerStarted","Data":"7986346c24670b4904e73468e59e0565635ca939794959f16470025cec3e677f"} Sep 30 05:48:52 crc kubenswrapper[4956]: I0930 05:48:52.785149 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5b82f755-357a-4b4e-84bd-1712077f17a5","Type":"ContainerStarted","Data":"20727c69e8b70d2b39ba3eb10b98bb76733a7742ab000030308f93ce3c90deb1"} Sep 30 05:48:52 crc kubenswrapper[4956]: I0930 05:48:52.835150 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.8351078090000001 podStartE2EDuration="1.835107809s" podCreationTimestamp="2025-09-30 05:48:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:48:52.825812621 +0000 UTC m=+1203.152933146" watchObservedRunningTime="2025-09-30 05:48:52.835107809 +0000 UTC m=+1203.162228334" Sep 30 05:48:52 crc kubenswrapper[4956]: I0930 05:48:52.909098 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 05:48:53 crc kubenswrapper[4956]: I0930 05:48:53.205189 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:48:53 crc kubenswrapper[4956]: I0930 05:48:53.338418 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:48:53 crc kubenswrapper[4956]: I0930 05:48:53.812153 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="b8761fc4-7d7b-428f-9987-271c1c79b40e" containerName="nova-api-log" containerID="cri-o://30c9eb4dcda0ff80fea1ee0b265875acdc1a91ca929d6bcf90a8bf93767a5109" gracePeriod=30 Sep 30 05:48:53 crc kubenswrapper[4956]: I0930 05:48:53.812426 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-82dqm" event={"ID":"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7","Type":"ContainerStarted","Data":"fce76c36eca57c3cdbbc9a180515704cccf70830def5459def56dd92abaa6bc2"} Sep 30 05:48:53 crc kubenswrapper[4956]: I0930 05:48:53.812649 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b8761fc4-7d7b-428f-9987-271c1c79b40e" containerName="nova-api-api" containerID="cri-o://47257eb2d258e9da1ffd7d6b50611cb0e28bc20cfd4f8cd46e1d0b753d420f2e" gracePeriod=30 Sep 30 05:48:53 crc kubenswrapper[4956]: I0930 05:48:53.813259 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:48:53 crc kubenswrapper[4956]: I0930 05:48:53.814524 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="ceilometer-central-agent" containerID="cri-o://867fb896f2fe9db31f7881fd9cd2b4f66860db34c0c3ec7a9949cd9a35b36b75" gracePeriod=30 Sep 30 05:48:53 crc kubenswrapper[4956]: I0930 05:48:53.814871 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="sg-core" containerID="cri-o://6ebd9160643d20d0700c5ec18b29036d9246f10d4da1e2c05bed8928cc93b6d6" gracePeriod=30 Sep 30 05:48:53 crc kubenswrapper[4956]: I0930 05:48:53.814842 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="ceilometer-notification-agent" 
containerID="cri-o://a529e05b04d10fd626daa89a16f879d7035c7cf4b1ca1a8debc1da6631057ed0" gracePeriod=30 Sep 30 05:48:53 crc kubenswrapper[4956]: I0930 05:48:53.816434 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="proxy-httpd" containerID="cri-o://0eae743885c8594d01610d8a29dce5f20bc4d7a287e24d95f5650fe987575993" gracePeriod=30 Sep 30 05:48:53 crc kubenswrapper[4956]: I0930 05:48:53.845686 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54599d8f7-82dqm" podStartSLOduration=3.845667935 podStartE2EDuration="3.845667935s" podCreationTimestamp="2025-09-30 05:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:48:53.841656211 +0000 UTC m=+1204.168776746" watchObservedRunningTime="2025-09-30 05:48:53.845667935 +0000 UTC m=+1204.172788460" Sep 30 05:48:54 crc kubenswrapper[4956]: I0930 05:48:54.824166 4956 generic.go:334] "Generic (PLEG): container finished" podID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerID="0eae743885c8594d01610d8a29dce5f20bc4d7a287e24d95f5650fe987575993" exitCode=0 Sep 30 05:48:54 crc kubenswrapper[4956]: I0930 05:48:54.824456 4956 generic.go:334] "Generic (PLEG): container finished" podID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerID="6ebd9160643d20d0700c5ec18b29036d9246f10d4da1e2c05bed8928cc93b6d6" exitCode=2 Sep 30 05:48:54 crc kubenswrapper[4956]: I0930 05:48:54.824466 4956 generic.go:334] "Generic (PLEG): container finished" podID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerID="867fb896f2fe9db31f7881fd9cd2b4f66860db34c0c3ec7a9949cd9a35b36b75" exitCode=0 Sep 30 05:48:54 crc kubenswrapper[4956]: I0930 05:48:54.824503 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"11cda964-5209-4ff7-8cc2-c9cc77d4a105","Type":"ContainerDied","Data":"0eae743885c8594d01610d8a29dce5f20bc4d7a287e24d95f5650fe987575993"} Sep 30 05:48:54 crc kubenswrapper[4956]: I0930 05:48:54.824533 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11cda964-5209-4ff7-8cc2-c9cc77d4a105","Type":"ContainerDied","Data":"6ebd9160643d20d0700c5ec18b29036d9246f10d4da1e2c05bed8928cc93b6d6"} Sep 30 05:48:54 crc kubenswrapper[4956]: I0930 05:48:54.824543 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11cda964-5209-4ff7-8cc2-c9cc77d4a105","Type":"ContainerDied","Data":"867fb896f2fe9db31f7881fd9cd2b4f66860db34c0c3ec7a9949cd9a35b36b75"} Sep 30 05:48:54 crc kubenswrapper[4956]: I0930 05:48:54.826049 4956 generic.go:334] "Generic (PLEG): container finished" podID="b8761fc4-7d7b-428f-9987-271c1c79b40e" containerID="47257eb2d258e9da1ffd7d6b50611cb0e28bc20cfd4f8cd46e1d0b753d420f2e" exitCode=0 Sep 30 05:48:54 crc kubenswrapper[4956]: I0930 05:48:54.826079 4956 generic.go:334] "Generic (PLEG): container finished" podID="b8761fc4-7d7b-428f-9987-271c1c79b40e" containerID="30c9eb4dcda0ff80fea1ee0b265875acdc1a91ca929d6bcf90a8bf93767a5109" exitCode=143 Sep 30 05:48:54 crc kubenswrapper[4956]: I0930 05:48:54.826089 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8761fc4-7d7b-428f-9987-271c1c79b40e","Type":"ContainerDied","Data":"47257eb2d258e9da1ffd7d6b50611cb0e28bc20cfd4f8cd46e1d0b753d420f2e"} Sep 30 05:48:54 crc kubenswrapper[4956]: I0930 05:48:54.826124 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8761fc4-7d7b-428f-9987-271c1c79b40e","Type":"ContainerDied","Data":"30c9eb4dcda0ff80fea1ee0b265875acdc1a91ca929d6bcf90a8bf93767a5109"} Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.235683 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.343905 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-762vq\" (UniqueName: \"kubernetes.io/projected/b8761fc4-7d7b-428f-9987-271c1c79b40e-kube-api-access-762vq\") pod \"b8761fc4-7d7b-428f-9987-271c1c79b40e\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.343988 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8761fc4-7d7b-428f-9987-271c1c79b40e-logs\") pod \"b8761fc4-7d7b-428f-9987-271c1c79b40e\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.344043 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8761fc4-7d7b-428f-9987-271c1c79b40e-config-data\") pod \"b8761fc4-7d7b-428f-9987-271c1c79b40e\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.344061 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8761fc4-7d7b-428f-9987-271c1c79b40e-combined-ca-bundle\") pod \"b8761fc4-7d7b-428f-9987-271c1c79b40e\" (UID: \"b8761fc4-7d7b-428f-9987-271c1c79b40e\") " Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.344705 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8761fc4-7d7b-428f-9987-271c1c79b40e-logs" (OuterVolumeSpecName: "logs") pod "b8761fc4-7d7b-428f-9987-271c1c79b40e" (UID: "b8761fc4-7d7b-428f-9987-271c1c79b40e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.351476 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8761fc4-7d7b-428f-9987-271c1c79b40e-kube-api-access-762vq" (OuterVolumeSpecName: "kube-api-access-762vq") pod "b8761fc4-7d7b-428f-9987-271c1c79b40e" (UID: "b8761fc4-7d7b-428f-9987-271c1c79b40e"). InnerVolumeSpecName "kube-api-access-762vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.375522 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8761fc4-7d7b-428f-9987-271c1c79b40e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8761fc4-7d7b-428f-9987-271c1c79b40e" (UID: "b8761fc4-7d7b-428f-9987-271c1c79b40e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.376472 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8761fc4-7d7b-428f-9987-271c1c79b40e-config-data" (OuterVolumeSpecName: "config-data") pod "b8761fc4-7d7b-428f-9987-271c1c79b40e" (UID: "b8761fc4-7d7b-428f-9987-271c1c79b40e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.447107 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8761fc4-7d7b-428f-9987-271c1c79b40e-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.447163 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8761fc4-7d7b-428f-9987-271c1c79b40e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.447174 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8761fc4-7d7b-428f-9987-271c1c79b40e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.447185 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-762vq\" (UniqueName: \"kubernetes.io/projected/b8761fc4-7d7b-428f-9987-271c1c79b40e-kube-api-access-762vq\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.848045 4956 generic.go:334] "Generic (PLEG): container finished" podID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerID="a529e05b04d10fd626daa89a16f879d7035c7cf4b1ca1a8debc1da6631057ed0" exitCode=0 Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.848103 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11cda964-5209-4ff7-8cc2-c9cc77d4a105","Type":"ContainerDied","Data":"a529e05b04d10fd626daa89a16f879d7035c7cf4b1ca1a8debc1da6631057ed0"} Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.851583 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8761fc4-7d7b-428f-9987-271c1c79b40e","Type":"ContainerDied","Data":"0532237d55ecb29d7b4d008aea5f1430d8a67a6597e945ea360b3f5e7eb5f12e"} Sep 30 05:48:55 crc 
kubenswrapper[4956]: I0930 05:48:55.851624 4956 scope.go:117] "RemoveContainer" containerID="47257eb2d258e9da1ffd7d6b50611cb0e28bc20cfd4f8cd46e1d0b753d420f2e" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.851720 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.890408 4956 scope.go:117] "RemoveContainer" containerID="30c9eb4dcda0ff80fea1ee0b265875acdc1a91ca929d6bcf90a8bf93767a5109" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.898443 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.904409 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.918061 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 05:48:55 crc kubenswrapper[4956]: E0930 05:48:55.918508 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8761fc4-7d7b-428f-9987-271c1c79b40e" containerName="nova-api-api" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.918525 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8761fc4-7d7b-428f-9987-271c1c79b40e" containerName="nova-api-api" Sep 30 05:48:55 crc kubenswrapper[4956]: E0930 05:48:55.918572 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8761fc4-7d7b-428f-9987-271c1c79b40e" containerName="nova-api-log" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.918578 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8761fc4-7d7b-428f-9987-271c1c79b40e" containerName="nova-api-log" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.918765 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8761fc4-7d7b-428f-9987-271c1c79b40e" containerName="nova-api-api" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.918787 4956 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b8761fc4-7d7b-428f-9987-271c1c79b40e" containerName="nova-api-log" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.921169 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.923907 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.925390 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.926009 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.933096 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:48:55 crc kubenswrapper[4956]: I0930 05:48:55.986996 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.058757 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-config-data\") pod \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.058835 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-scripts\") pod \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.058871 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11cda964-5209-4ff7-8cc2-c9cc77d4a105-run-httpd\") pod \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.058906 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-ceilometer-tls-certs\") pod \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.059008 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmvn5\" (UniqueName: \"kubernetes.io/projected/11cda964-5209-4ff7-8cc2-c9cc77d4a105-kube-api-access-vmvn5\") pod \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.059090 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/11cda964-5209-4ff7-8cc2-c9cc77d4a105-log-httpd\") pod \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.059187 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-combined-ca-bundle\") pod \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.059240 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-sg-core-conf-yaml\") pod \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\" (UID: \"11cda964-5209-4ff7-8cc2-c9cc77d4a105\") " Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.059478 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-public-tls-certs\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.059521 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.059522 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11cda964-5209-4ff7-8cc2-c9cc77d4a105-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "11cda964-5209-4ff7-8cc2-c9cc77d4a105" (UID: "11cda964-5209-4ff7-8cc2-c9cc77d4a105"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.059557 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.059612 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbfbn\" (UniqueName: \"kubernetes.io/projected/696241f2-167e-4e1a-8791-3df5581472fe-kube-api-access-mbfbn\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.059656 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696241f2-167e-4e1a-8791-3df5581472fe-logs\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.059674 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-config-data\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.059717 4956 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11cda964-5209-4ff7-8cc2-c9cc77d4a105-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.060910 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/11cda964-5209-4ff7-8cc2-c9cc77d4a105-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "11cda964-5209-4ff7-8cc2-c9cc77d4a105" (UID: "11cda964-5209-4ff7-8cc2-c9cc77d4a105"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.065413 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11cda964-5209-4ff7-8cc2-c9cc77d4a105-kube-api-access-vmvn5" (OuterVolumeSpecName: "kube-api-access-vmvn5") pod "11cda964-5209-4ff7-8cc2-c9cc77d4a105" (UID: "11cda964-5209-4ff7-8cc2-c9cc77d4a105"). InnerVolumeSpecName "kube-api-access-vmvn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.066140 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-scripts" (OuterVolumeSpecName: "scripts") pod "11cda964-5209-4ff7-8cc2-c9cc77d4a105" (UID: "11cda964-5209-4ff7-8cc2-c9cc77d4a105"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.093906 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "11cda964-5209-4ff7-8cc2-c9cc77d4a105" (UID: "11cda964-5209-4ff7-8cc2-c9cc77d4a105"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.137914 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "11cda964-5209-4ff7-8cc2-c9cc77d4a105" (UID: "11cda964-5209-4ff7-8cc2-c9cc77d4a105"). 
InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.156764 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11cda964-5209-4ff7-8cc2-c9cc77d4a105" (UID: "11cda964-5209-4ff7-8cc2-c9cc77d4a105"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.161172 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.161278 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.161433 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbfbn\" (UniqueName: \"kubernetes.io/projected/696241f2-167e-4e1a-8791-3df5581472fe-kube-api-access-mbfbn\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.162298 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696241f2-167e-4e1a-8791-3df5581472fe-logs\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc 
kubenswrapper[4956]: I0930 05:48:56.162337 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-config-data\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.162451 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-public-tls-certs\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.162534 4956 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.162581 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.162599 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.162850 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696241f2-167e-4e1a-8791-3df5581472fe-logs\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.163050 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmvn5\" (UniqueName: 
\"kubernetes.io/projected/11cda964-5209-4ff7-8cc2-c9cc77d4a105-kube-api-access-vmvn5\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.163078 4956 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/11cda964-5209-4ff7-8cc2-c9cc77d4a105-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.163092 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.165029 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.165464 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-config-data\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.165941 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.175182 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.176841 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbfbn\" (UniqueName: \"kubernetes.io/projected/696241f2-167e-4e1a-8791-3df5581472fe-kube-api-access-mbfbn\") pod \"nova-api-0\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.196588 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-config-data" (OuterVolumeSpecName: "config-data") pod "11cda964-5209-4ff7-8cc2-c9cc77d4a105" (UID: "11cda964-5209-4ff7-8cc2-c9cc77d4a105"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.264672 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cda964-5209-4ff7-8cc2-c9cc77d4a105-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.281211 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.356145 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8761fc4-7d7b-428f-9987-271c1c79b40e" path="/var/lib/kubelet/pods/b8761fc4-7d7b-428f-9987-271c1c79b40e/volumes" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.489001 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.742864 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.866565 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"11cda964-5209-4ff7-8cc2-c9cc77d4a105","Type":"ContainerDied","Data":"a56ae80d109f8efb5431930ceaf93575d8905ca4a272336e75e4e9e9de114b17"} Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.866865 4956 scope.go:117] "RemoveContainer" containerID="0eae743885c8594d01610d8a29dce5f20bc4d7a287e24d95f5650fe987575993" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.866604 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.869727 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"696241f2-167e-4e1a-8791-3df5581472fe","Type":"ContainerStarted","Data":"ef9a5fc24483688b723dc7892965bf5c5322739fbaa314dc2d46b896b3f3b604"} Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.895405 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.910518 4956 scope.go:117] "RemoveContainer" containerID="6ebd9160643d20d0700c5ec18b29036d9246f10d4da1e2c05bed8928cc93b6d6" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.919685 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.958189 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:48:56 crc kubenswrapper[4956]: E0930 05:48:56.958649 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="ceilometer-central-agent" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.958667 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="ceilometer-central-agent" Sep 30 05:48:56 crc kubenswrapper[4956]: E0930 05:48:56.958701 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="sg-core" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.958708 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="sg-core" Sep 30 05:48:56 crc kubenswrapper[4956]: E0930 05:48:56.958726 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="ceilometer-notification-agent" Sep 30 
05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.958732 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="ceilometer-notification-agent" Sep 30 05:48:56 crc kubenswrapper[4956]: E0930 05:48:56.958744 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="proxy-httpd" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.958750 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="proxy-httpd" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.958944 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="ceilometer-central-agent" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.958978 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="ceilometer-notification-agent" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.958987 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="sg-core" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.958998 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" containerName="proxy-httpd" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.960919 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.963064 4956 scope.go:117] "RemoveContainer" containerID="a529e05b04d10fd626daa89a16f879d7035c7cf4b1ca1a8debc1da6631057ed0" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.963285 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.963299 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.963436 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.972967 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:48:56 crc kubenswrapper[4956]: I0930 05:48:56.989299 4956 scope.go:117] "RemoveContainer" containerID="867fb896f2fe9db31f7881fd9cd2b4f66860db34c0c3ec7a9949cd9a35b36b75" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.080899 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-log-httpd\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.080942 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.080968 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-scripts\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.081028 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.081064 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvfjw\" (UniqueName: \"kubernetes.io/projected/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-kube-api-access-dvfjw\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.081175 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.081212 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-run-httpd\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.081244 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-config-data\") pod \"ceilometer-0\" (UID: 
\"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.183316 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.183411 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-run-httpd\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.183459 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-config-data\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.183509 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-log-httpd\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.183547 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.183576 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-scripts\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.183606 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.183653 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvfjw\" (UniqueName: \"kubernetes.io/projected/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-kube-api-access-dvfjw\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.183874 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-run-httpd\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.184556 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-log-httpd\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.187809 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 
05:48:57.188534 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.189002 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.189531 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-config-data\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.195548 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-scripts\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.203043 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvfjw\" (UniqueName: \"kubernetes.io/projected/c3e1cc75-7672-44f7-acb3-69ef2ae910d1-kube-api-access-dvfjw\") pod \"ceilometer-0\" (UID: \"c3e1cc75-7672-44f7-acb3-69ef2ae910d1\") " pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.282238 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.761564 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 05:48:57 crc kubenswrapper[4956]: W0930 05:48:57.769703 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3e1cc75_7672_44f7_acb3_69ef2ae910d1.slice/crio-00d5400262fbf9152cc88c1c1bba4999f5dbe87b967d1442352ac7e7f4b13f27 WatchSource:0}: Error finding container 00d5400262fbf9152cc88c1c1bba4999f5dbe87b967d1442352ac7e7f4b13f27: Status 404 returned error can't find the container with id 00d5400262fbf9152cc88c1c1bba4999f5dbe87b967d1442352ac7e7f4b13f27 Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.882861 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"696241f2-167e-4e1a-8791-3df5581472fe","Type":"ContainerStarted","Data":"58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df"} Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.882902 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"696241f2-167e-4e1a-8791-3df5581472fe","Type":"ContainerStarted","Data":"3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8"} Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.886379 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3e1cc75-7672-44f7-acb3-69ef2ae910d1","Type":"ContainerStarted","Data":"00d5400262fbf9152cc88c1c1bba4999f5dbe87b967d1442352ac7e7f4b13f27"} Sep 30 05:48:57 crc kubenswrapper[4956]: I0930 05:48:57.906417 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.906397626 podStartE2EDuration="2.906397626s" podCreationTimestamp="2025-09-30 05:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:48:57.904618991 +0000 UTC m=+1208.231739506" watchObservedRunningTime="2025-09-30 05:48:57.906397626 +0000 UTC m=+1208.233518151" Sep 30 05:48:58 crc kubenswrapper[4956]: I0930 05:48:58.354551 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11cda964-5209-4ff7-8cc2-c9cc77d4a105" path="/var/lib/kubelet/pods/11cda964-5209-4ff7-8cc2-c9cc77d4a105/volumes" Sep 30 05:48:58 crc kubenswrapper[4956]: I0930 05:48:58.900892 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3e1cc75-7672-44f7-acb3-69ef2ae910d1","Type":"ContainerStarted","Data":"1772b7de38f616e078b5ec28d69612ce4551b1eedf543f78521f5081e7449338"} Sep 30 05:48:58 crc kubenswrapper[4956]: I0930 05:48:58.900954 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3e1cc75-7672-44f7-acb3-69ef2ae910d1","Type":"ContainerStarted","Data":"dc0b6b6aca1544464a190fc02c05291d7a0b38fdb3d9edb7e31eeb7370783b7a"} Sep 30 05:48:59 crc kubenswrapper[4956]: I0930 05:48:59.912611 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3e1cc75-7672-44f7-acb3-69ef2ae910d1","Type":"ContainerStarted","Data":"4117a6738f4f1bc5f5fb7acc156bd87615c80b1b2989b8b086591c23c5773304"} Sep 30 05:49:00 crc kubenswrapper[4956]: I0930 05:49:00.929492 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3e1cc75-7672-44f7-acb3-69ef2ae910d1","Type":"ContainerStarted","Data":"7997b49ad3e275c13c95a3523db7d74e588803a4673ce365a8976b062d7c81b3"} Sep 30 05:49:00 crc kubenswrapper[4956]: I0930 05:49:00.931453 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 05:49:00 crc kubenswrapper[4956]: I0930 05:49:00.961401 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.326992598 podStartE2EDuration="4.961384029s" podCreationTimestamp="2025-09-30 05:48:56 +0000 UTC" firstStartedPulling="2025-09-30 05:48:57.773097891 +0000 UTC m=+1208.100218416" lastFinishedPulling="2025-09-30 05:49:00.407489312 +0000 UTC m=+1210.734609847" observedRunningTime="2025-09-30 05:49:00.949024346 +0000 UTC m=+1211.276144871" watchObservedRunningTime="2025-09-30 05:49:00.961384029 +0000 UTC m=+1211.288504554" Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.324892 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.403797 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-rdwj5"] Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.404034 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" podUID="00eae5e6-c773-4bd9-af0f-bc17e1658996" containerName="dnsmasq-dns" containerID="cri-o://261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c" gracePeriod=10 Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.488353 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.506350 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.920514 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.940948 4956 generic.go:334] "Generic (PLEG): container finished" podID="00eae5e6-c773-4bd9-af0f-bc17e1658996" containerID="261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c" exitCode=0 Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.941965 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.942390 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" event={"ID":"00eae5e6-c773-4bd9-af0f-bc17e1658996","Type":"ContainerDied","Data":"261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c"} Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.942418 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-rdwj5" event={"ID":"00eae5e6-c773-4bd9-af0f-bc17e1658996","Type":"ContainerDied","Data":"8c18c366a66f263d5b322b879a82ae850d094cdc98bde4340117d6a8c5a1abbc"} Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.942434 4956 scope.go:117] "RemoveContainer" containerID="261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c" Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.972701 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.973380 4956 scope.go:117] "RemoveContainer" containerID="4a88d58a0a5755f56a3a308a61f29d407e864a944db3025e57981033e2e28bb9" Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.978667 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-dns-svc\") pod \"00eae5e6-c773-4bd9-af0f-bc17e1658996\" (UID: 
\"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.978769 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-config\") pod \"00eae5e6-c773-4bd9-af0f-bc17e1658996\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.978795 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-ovsdbserver-nb\") pod \"00eae5e6-c773-4bd9-af0f-bc17e1658996\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.978835 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-dns-swift-storage-0\") pod \"00eae5e6-c773-4bd9-af0f-bc17e1658996\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.978870 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-ovsdbserver-sb\") pod \"00eae5e6-c773-4bd9-af0f-bc17e1658996\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.978951 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgwnl\" (UniqueName: \"kubernetes.io/projected/00eae5e6-c773-4bd9-af0f-bc17e1658996-kube-api-access-mgwnl\") pod \"00eae5e6-c773-4bd9-af0f-bc17e1658996\" (UID: \"00eae5e6-c773-4bd9-af0f-bc17e1658996\") " Sep 30 05:49:01 crc kubenswrapper[4956]: I0930 05:49:01.991314 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/00eae5e6-c773-4bd9-af0f-bc17e1658996-kube-api-access-mgwnl" (OuterVolumeSpecName: "kube-api-access-mgwnl") pod "00eae5e6-c773-4bd9-af0f-bc17e1658996" (UID: "00eae5e6-c773-4bd9-af0f-bc17e1658996"). InnerVolumeSpecName "kube-api-access-mgwnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.042146 4956 scope.go:117] "RemoveContainer" containerID="261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c" Sep 30 05:49:02 crc kubenswrapper[4956]: E0930 05:49:02.060960 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c\": container with ID starting with 261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c not found: ID does not exist" containerID="261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.061015 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c"} err="failed to get container status \"261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c\": rpc error: code = NotFound desc = could not find container \"261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c\": container with ID starting with 261355b4edb38af3a6cacd5747db43702b0621a9369d04534caab2abf86e8f1c not found: ID does not exist" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.061055 4956 scope.go:117] "RemoveContainer" containerID="4a88d58a0a5755f56a3a308a61f29d407e864a944db3025e57981033e2e28bb9" Sep 30 05:49:02 crc kubenswrapper[4956]: E0930 05:49:02.067921 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a88d58a0a5755f56a3a308a61f29d407e864a944db3025e57981033e2e28bb9\": 
container with ID starting with 4a88d58a0a5755f56a3a308a61f29d407e864a944db3025e57981033e2e28bb9 not found: ID does not exist" containerID="4a88d58a0a5755f56a3a308a61f29d407e864a944db3025e57981033e2e28bb9" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.067979 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a88d58a0a5755f56a3a308a61f29d407e864a944db3025e57981033e2e28bb9"} err="failed to get container status \"4a88d58a0a5755f56a3a308a61f29d407e864a944db3025e57981033e2e28bb9\": rpc error: code = NotFound desc = could not find container \"4a88d58a0a5755f56a3a308a61f29d407e864a944db3025e57981033e2e28bb9\": container with ID starting with 4a88d58a0a5755f56a3a308a61f29d407e864a944db3025e57981033e2e28bb9 not found: ID does not exist" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.068282 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-config" (OuterVolumeSpecName: "config") pod "00eae5e6-c773-4bd9-af0f-bc17e1658996" (UID: "00eae5e6-c773-4bd9-af0f-bc17e1658996"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.080826 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00eae5e6-c773-4bd9-af0f-bc17e1658996" (UID: "00eae5e6-c773-4bd9-af0f-bc17e1658996"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.087503 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.087531 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.087541 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgwnl\" (UniqueName: \"kubernetes.io/projected/00eae5e6-c773-4bd9-af0f-bc17e1658996-kube-api-access-mgwnl\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.134718 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "00eae5e6-c773-4bd9-af0f-bc17e1658996" (UID: "00eae5e6-c773-4bd9-af0f-bc17e1658996"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.145743 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vvw9m"] Sep 30 05:49:02 crc kubenswrapper[4956]: E0930 05:49:02.146297 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00eae5e6-c773-4bd9-af0f-bc17e1658996" containerName="dnsmasq-dns" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.146316 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="00eae5e6-c773-4bd9-af0f-bc17e1658996" containerName="dnsmasq-dns" Sep 30 05:49:02 crc kubenswrapper[4956]: E0930 05:49:02.146361 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00eae5e6-c773-4bd9-af0f-bc17e1658996" containerName="init" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.146370 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="00eae5e6-c773-4bd9-af0f-bc17e1658996" containerName="init" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.146573 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="00eae5e6-c773-4bd9-af0f-bc17e1658996" containerName="dnsmasq-dns" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.147270 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.147581 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00eae5e6-c773-4bd9-af0f-bc17e1658996" (UID: "00eae5e6-c773-4bd9-af0f-bc17e1658996"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.151551 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.152154 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.156577 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00eae5e6-c773-4bd9-af0f-bc17e1658996" (UID: "00eae5e6-c773-4bd9-af0f-bc17e1658996"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.158535 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vvw9m"] Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.190444 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.190477 4956 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.190488 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00eae5e6-c773-4bd9-af0f-bc17e1658996-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.281840 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-rdwj5"] Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 
05:49:02.289450 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-rdwj5"] Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.292389 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkhz4\" (UniqueName: \"kubernetes.io/projected/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-kube-api-access-zkhz4\") pod \"nova-cell1-cell-mapping-vvw9m\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.292446 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-config-data\") pod \"nova-cell1-cell-mapping-vvw9m\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.292500 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vvw9m\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.292531 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-scripts\") pod \"nova-cell1-cell-mapping-vvw9m\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.355153 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00eae5e6-c773-4bd9-af0f-bc17e1658996" path="/var/lib/kubelet/pods/00eae5e6-c773-4bd9-af0f-bc17e1658996/volumes" 
Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.394126 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkhz4\" (UniqueName: \"kubernetes.io/projected/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-kube-api-access-zkhz4\") pod \"nova-cell1-cell-mapping-vvw9m\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.394414 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-config-data\") pod \"nova-cell1-cell-mapping-vvw9m\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.394527 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vvw9m\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.394618 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-scripts\") pod \"nova-cell1-cell-mapping-vvw9m\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.399237 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vvw9m\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 
05:49:02.399758 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-scripts\") pod \"nova-cell1-cell-mapping-vvw9m\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.399955 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-config-data\") pod \"nova-cell1-cell-mapping-vvw9m\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.411970 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkhz4\" (UniqueName: \"kubernetes.io/projected/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-kube-api-access-zkhz4\") pod \"nova-cell1-cell-mapping-vvw9m\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.464895 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:02 crc kubenswrapper[4956]: W0930 05:49:02.973194 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a7a0b4a_2f92_4af7_a301_b31e1fbe39a8.slice/crio-d63925a2d60ba05654c63cc9d789ba91093b3b81c8847bbeb1fb44699f4ac31a WatchSource:0}: Error finding container d63925a2d60ba05654c63cc9d789ba91093b3b81c8847bbeb1fb44699f4ac31a: Status 404 returned error can't find the container with id d63925a2d60ba05654c63cc9d789ba91093b3b81c8847bbeb1fb44699f4ac31a Sep 30 05:49:02 crc kubenswrapper[4956]: I0930 05:49:02.981298 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vvw9m"] Sep 30 05:49:03 crc kubenswrapper[4956]: I0930 05:49:03.972677 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vvw9m" event={"ID":"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8","Type":"ContainerStarted","Data":"5d866abb3a48b8d48a5d8a1713bfc515487fe6f85e8335d457c23d46a78a093a"} Sep 30 05:49:03 crc kubenswrapper[4956]: I0930 05:49:03.973055 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vvw9m" event={"ID":"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8","Type":"ContainerStarted","Data":"d63925a2d60ba05654c63cc9d789ba91093b3b81c8847bbeb1fb44699f4ac31a"} Sep 30 05:49:03 crc kubenswrapper[4956]: I0930 05:49:03.992355 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vvw9m" podStartSLOduration=1.992336576 podStartE2EDuration="1.992336576s" podCreationTimestamp="2025-09-30 05:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:49:03.986921988 +0000 UTC m=+1214.314042533" watchObservedRunningTime="2025-09-30 05:49:03.992336576 +0000 UTC m=+1214.319457101" Sep 30 05:49:06 crc 
kubenswrapper[4956]: I0930 05:49:06.307224 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 05:49:06 crc kubenswrapper[4956]: I0930 05:49:06.307807 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 05:49:07 crc kubenswrapper[4956]: I0930 05:49:07.327684 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="696241f2-167e-4e1a-8791-3df5581472fe" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.217:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 05:49:07 crc kubenswrapper[4956]: I0930 05:49:07.328011 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="696241f2-167e-4e1a-8791-3df5581472fe" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 05:49:09 crc kubenswrapper[4956]: I0930 05:49:09.021984 4956 generic.go:334] "Generic (PLEG): container finished" podID="6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8" containerID="5d866abb3a48b8d48a5d8a1713bfc515487fe6f85e8335d457c23d46a78a093a" exitCode=0 Sep 30 05:49:09 crc kubenswrapper[4956]: I0930 05:49:09.022319 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vvw9m" event={"ID":"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8","Type":"ContainerDied","Data":"5d866abb3a48b8d48a5d8a1713bfc515487fe6f85e8335d457c23d46a78a093a"} Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.518916 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.573792 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-combined-ca-bundle\") pod \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.573918 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkhz4\" (UniqueName: \"kubernetes.io/projected/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-kube-api-access-zkhz4\") pod \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.573954 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-scripts\") pod \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.574017 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-config-data\") pod \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\" (UID: \"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8\") " Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.579689 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-kube-api-access-zkhz4" (OuterVolumeSpecName: "kube-api-access-zkhz4") pod "6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8" (UID: "6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8"). InnerVolumeSpecName "kube-api-access-zkhz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.580605 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-scripts" (OuterVolumeSpecName: "scripts") pod "6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8" (UID: "6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.603006 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8" (UID: "6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.615697 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-config-data" (OuterVolumeSpecName: "config-data") pod "6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8" (UID: "6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.676753 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.676788 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.676805 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkhz4\" (UniqueName: \"kubernetes.io/projected/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-kube-api-access-zkhz4\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.676818 4956 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.930276 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.930753 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0d20a784-4e75-4f0b-bcf2-72ab60418bee" containerName="nova-scheduler-scheduler" containerID="cri-o://16163f5ab02a322505f394312fd9f2e58fca953b87746af3abb1e923670509a5" gracePeriod=30 Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.951905 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.952354 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="696241f2-167e-4e1a-8791-3df5581472fe" 
containerName="nova-api-log" containerID="cri-o://3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8" gracePeriod=30 Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.952434 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="696241f2-167e-4e1a-8791-3df5581472fe" containerName="nova-api-api" containerID="cri-o://58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df" gracePeriod=30 Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.960283 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.960526 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2f34a355-be28-482b-a242-f113c5c9fd59" containerName="nova-metadata-log" containerID="cri-o://a959ea428a0d9f9f8d0c1946bc11a4f84312bee84d8324d9126555100bf2e1c9" gracePeriod=30 Sep 30 05:49:10 crc kubenswrapper[4956]: I0930 05:49:10.960686 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2f34a355-be28-482b-a242-f113c5c9fd59" containerName="nova-metadata-metadata" containerID="cri-o://0de317ef60c75e49530deccf619e09f72fb828eafb152661c89aa4170abea844" gracePeriod=30 Sep 30 05:49:11 crc kubenswrapper[4956]: I0930 05:49:11.052509 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vvw9m" event={"ID":"6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8","Type":"ContainerDied","Data":"d63925a2d60ba05654c63cc9d789ba91093b3b81c8847bbeb1fb44699f4ac31a"} Sep 30 05:49:11 crc kubenswrapper[4956]: I0930 05:49:11.052549 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d63925a2d60ba05654c63cc9d789ba91093b3b81c8847bbeb1fb44699f4ac31a" Sep 30 05:49:11 crc kubenswrapper[4956]: I0930 05:49:11.052637 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vvw9m" Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.085001 4956 generic.go:334] "Generic (PLEG): container finished" podID="696241f2-167e-4e1a-8791-3df5581472fe" containerID="3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8" exitCode=143 Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.085077 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"696241f2-167e-4e1a-8791-3df5581472fe","Type":"ContainerDied","Data":"3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8"} Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.087848 4956 generic.go:334] "Generic (PLEG): container finished" podID="2f34a355-be28-482b-a242-f113c5c9fd59" containerID="0de317ef60c75e49530deccf619e09f72fb828eafb152661c89aa4170abea844" exitCode=0 Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.087886 4956 generic.go:334] "Generic (PLEG): container finished" podID="2f34a355-be28-482b-a242-f113c5c9fd59" containerID="a959ea428a0d9f9f8d0c1946bc11a4f84312bee84d8324d9126555100bf2e1c9" exitCode=143 Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.087919 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f34a355-be28-482b-a242-f113c5c9fd59","Type":"ContainerDied","Data":"0de317ef60c75e49530deccf619e09f72fb828eafb152661c89aa4170abea844"} Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.087954 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f34a355-be28-482b-a242-f113c5c9fd59","Type":"ContainerDied","Data":"a959ea428a0d9f9f8d0c1946bc11a4f84312bee84d8324d9126555100bf2e1c9"} Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.345002 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.425035 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f34a355-be28-482b-a242-f113c5c9fd59-logs\") pod \"2f34a355-be28-482b-a242-f113c5c9fd59\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.425095 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wff9k\" (UniqueName: \"kubernetes.io/projected/2f34a355-be28-482b-a242-f113c5c9fd59-kube-api-access-wff9k\") pod \"2f34a355-be28-482b-a242-f113c5c9fd59\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.425157 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-combined-ca-bundle\") pod \"2f34a355-be28-482b-a242-f113c5c9fd59\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.425206 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-config-data\") pod \"2f34a355-be28-482b-a242-f113c5c9fd59\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.425254 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-nova-metadata-tls-certs\") pod \"2f34a355-be28-482b-a242-f113c5c9fd59\" (UID: \"2f34a355-be28-482b-a242-f113c5c9fd59\") " Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.430245 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2f34a355-be28-482b-a242-f113c5c9fd59-logs" (OuterVolumeSpecName: "logs") pod "2f34a355-be28-482b-a242-f113c5c9fd59" (UID: "2f34a355-be28-482b-a242-f113c5c9fd59"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.445878 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f34a355-be28-482b-a242-f113c5c9fd59-kube-api-access-wff9k" (OuterVolumeSpecName: "kube-api-access-wff9k") pod "2f34a355-be28-482b-a242-f113c5c9fd59" (UID: "2f34a355-be28-482b-a242-f113c5c9fd59"). InnerVolumeSpecName "kube-api-access-wff9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.469008 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-config-data" (OuterVolumeSpecName: "config-data") pod "2f34a355-be28-482b-a242-f113c5c9fd59" (UID: "2f34a355-be28-482b-a242-f113c5c9fd59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.472335 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f34a355-be28-482b-a242-f113c5c9fd59" (UID: "2f34a355-be28-482b-a242-f113c5c9fd59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.513141 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2f34a355-be28-482b-a242-f113c5c9fd59" (UID: "2f34a355-be28-482b-a242-f113c5c9fd59"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.527590 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f34a355-be28-482b-a242-f113c5c9fd59-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.527762 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wff9k\" (UniqueName: \"kubernetes.io/projected/2f34a355-be28-482b-a242-f113c5c9fd59-kube-api-access-wff9k\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.527818 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.527868 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:12 crc kubenswrapper[4956]: I0930 05:49:12.527918 4956 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f34a355-be28-482b-a242-f113c5c9fd59-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.016524 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.038915 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696241f2-167e-4e1a-8791-3df5581472fe-logs\") pod \"696241f2-167e-4e1a-8791-3df5581472fe\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.042050 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbfbn\" (UniqueName: \"kubernetes.io/projected/696241f2-167e-4e1a-8791-3df5581472fe-kube-api-access-mbfbn\") pod \"696241f2-167e-4e1a-8791-3df5581472fe\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.040044 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696241f2-167e-4e1a-8791-3df5581472fe-logs" (OuterVolumeSpecName: "logs") pod "696241f2-167e-4e1a-8791-3df5581472fe" (UID: "696241f2-167e-4e1a-8791-3df5581472fe"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.042184 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-public-tls-certs\") pod \"696241f2-167e-4e1a-8791-3df5581472fe\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.042277 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-config-data\") pod \"696241f2-167e-4e1a-8791-3df5581472fe\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.042322 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-internal-tls-certs\") pod \"696241f2-167e-4e1a-8791-3df5581472fe\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.042390 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-combined-ca-bundle\") pod \"696241f2-167e-4e1a-8791-3df5581472fe\" (UID: \"696241f2-167e-4e1a-8791-3df5581472fe\") " Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.042974 4956 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696241f2-167e-4e1a-8791-3df5581472fe-logs\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.047332 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/696241f2-167e-4e1a-8791-3df5581472fe-kube-api-access-mbfbn" (OuterVolumeSpecName: "kube-api-access-mbfbn") pod 
"696241f2-167e-4e1a-8791-3df5581472fe" (UID: "696241f2-167e-4e1a-8791-3df5581472fe"). InnerVolumeSpecName "kube-api-access-mbfbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.078318 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-config-data" (OuterVolumeSpecName: "config-data") pod "696241f2-167e-4e1a-8791-3df5581472fe" (UID: "696241f2-167e-4e1a-8791-3df5581472fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.110843 4956 generic.go:334] "Generic (PLEG): container finished" podID="0d20a784-4e75-4f0b-bcf2-72ab60418bee" containerID="16163f5ab02a322505f394312fd9f2e58fca953b87746af3abb1e923670509a5" exitCode=0 Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.110898 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d20a784-4e75-4f0b-bcf2-72ab60418bee","Type":"ContainerDied","Data":"16163f5ab02a322505f394312fd9f2e58fca953b87746af3abb1e923670509a5"} Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.118138 4956 generic.go:334] "Generic (PLEG): container finished" podID="696241f2-167e-4e1a-8791-3df5581472fe" containerID="58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df" exitCode=0 Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.118380 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.118390 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"696241f2-167e-4e1a-8791-3df5581472fe","Type":"ContainerDied","Data":"58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df"} Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.118416 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"696241f2-167e-4e1a-8791-3df5581472fe","Type":"ContainerDied","Data":"ef9a5fc24483688b723dc7892965bf5c5322739fbaa314dc2d46b896b3f3b604"} Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.118432 4956 scope.go:117] "RemoveContainer" containerID="58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.121504 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f34a355-be28-482b-a242-f113c5c9fd59","Type":"ContainerDied","Data":"b642c8bfde7bf71b80d7e3dac32d0de64dee4eb4385461daa613bf928a444a66"} Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.121584 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.127834 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "696241f2-167e-4e1a-8791-3df5581472fe" (UID: "696241f2-167e-4e1a-8791-3df5581472fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.128240 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "696241f2-167e-4e1a-8791-3df5581472fe" (UID: "696241f2-167e-4e1a-8791-3df5581472fe"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.128770 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "696241f2-167e-4e1a-8791-3df5581472fe" (UID: "696241f2-167e-4e1a-8791-3df5581472fe"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.148343 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.148379 4956 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.148391 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.148401 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbfbn\" (UniqueName: \"kubernetes.io/projected/696241f2-167e-4e1a-8791-3df5581472fe-kube-api-access-mbfbn\") on 
node \"crc\" DevicePath \"\"" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.148410 4956 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696241f2-167e-4e1a-8791-3df5581472fe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.157999 4956 scope.go:117] "RemoveContainer" containerID="3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.164547 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.197711 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.203017 4956 scope.go:117] "RemoveContainer" containerID="58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df" Sep 30 05:49:13 crc kubenswrapper[4956]: E0930 05:49:13.204704 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df\": container with ID starting with 58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df not found: ID does not exist" containerID="58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.204740 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df"} err="failed to get container status \"58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df\": rpc error: code = NotFound desc = could not find container \"58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df\": container with ID starting with 58b972ad060657048a4b6b64d1deb32ffcd662ac57f335bc9fbfc986ddf3a9df not found: 
ID does not exist" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.204764 4956 scope.go:117] "RemoveContainer" containerID="3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.205008 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:49:13 crc kubenswrapper[4956]: E0930 05:49:13.205236 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8\": container with ID starting with 3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8 not found: ID does not exist" containerID="3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.205272 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8"} err="failed to get container status \"3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8\": rpc error: code = NotFound desc = could not find container \"3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8\": container with ID starting with 3d21b42be0b31a5e7e07a089eca04b2f4a0e016df52ebaca46fe00786c1e4ea8 not found: ID does not exist" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.205290 4956 scope.go:117] "RemoveContainer" containerID="0de317ef60c75e49530deccf619e09f72fb828eafb152661c89aa4170abea844" Sep 30 05:49:13 crc kubenswrapper[4956]: E0930 05:49:13.206346 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696241f2-167e-4e1a-8791-3df5581472fe" containerName="nova-api-log" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.206370 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="696241f2-167e-4e1a-8791-3df5581472fe" containerName="nova-api-log" Sep 30 05:49:13 crc 
kubenswrapper[4956]: E0930 05:49:13.206521 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f34a355-be28-482b-a242-f113c5c9fd59" containerName="nova-metadata-log" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.206536 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f34a355-be28-482b-a242-f113c5c9fd59" containerName="nova-metadata-log" Sep 30 05:49:13 crc kubenswrapper[4956]: E0930 05:49:13.206548 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8" containerName="nova-manage" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.206554 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8" containerName="nova-manage" Sep 30 05:49:13 crc kubenswrapper[4956]: E0930 05:49:13.206574 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696241f2-167e-4e1a-8791-3df5581472fe" containerName="nova-api-api" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.206580 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="696241f2-167e-4e1a-8791-3df5581472fe" containerName="nova-api-api" Sep 30 05:49:13 crc kubenswrapper[4956]: E0930 05:49:13.206598 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f34a355-be28-482b-a242-f113c5c9fd59" containerName="nova-metadata-metadata" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.206606 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f34a355-be28-482b-a242-f113c5c9fd59" containerName="nova-metadata-metadata" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.207053 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f34a355-be28-482b-a242-f113c5c9fd59" containerName="nova-metadata-log" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.207069 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8" containerName="nova-manage" Sep 30 05:49:13 crc 
kubenswrapper[4956]: I0930 05:49:13.207083 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="696241f2-167e-4e1a-8791-3df5581472fe" containerName="nova-api-api" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.207223 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f34a355-be28-482b-a242-f113c5c9fd59" containerName="nova-metadata-metadata" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.207265 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="696241f2-167e-4e1a-8791-3df5581472fe" containerName="nova-api-log" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.208491 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.213954 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.216311 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.220400 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.236689 4956 scope.go:117] "RemoveContainer" containerID="a959ea428a0d9f9f8d0c1946bc11a4f84312bee84d8324d9126555100bf2e1c9" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.250385 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0effab61-b755-4bd2-afb3-71cdf7983dc3-logs\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.250444 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/0effab61-b755-4bd2-afb3-71cdf7983dc3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.250483 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0effab61-b755-4bd2-afb3-71cdf7983dc3-config-data\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.250715 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0effab61-b755-4bd2-afb3-71cdf7983dc3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.250843 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpn7k\" (UniqueName: \"kubernetes.io/projected/0effab61-b755-4bd2-afb3-71cdf7983dc3-kube-api-access-qpn7k\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.265249 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.351757 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d20a784-4e75-4f0b-bcf2-72ab60418bee-combined-ca-bundle\") pod \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\" (UID: \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\") " Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.351798 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d20a784-4e75-4f0b-bcf2-72ab60418bee-config-data\") pod \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\" (UID: \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\") " Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.351929 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls28q\" (UniqueName: \"kubernetes.io/projected/0d20a784-4e75-4f0b-bcf2-72ab60418bee-kube-api-access-ls28q\") pod \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\" (UID: \"0d20a784-4e75-4f0b-bcf2-72ab60418bee\") " Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.352094 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpn7k\" (UniqueName: \"kubernetes.io/projected/0effab61-b755-4bd2-afb3-71cdf7983dc3-kube-api-access-qpn7k\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.352204 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0effab61-b755-4bd2-afb3-71cdf7983dc3-logs\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.352240 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0effab61-b755-4bd2-afb3-71cdf7983dc3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.352288 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0effab61-b755-4bd2-afb3-71cdf7983dc3-config-data\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.352343 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0effab61-b755-4bd2-afb3-71cdf7983dc3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.354473 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0effab61-b755-4bd2-afb3-71cdf7983dc3-logs\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.355273 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d20a784-4e75-4f0b-bcf2-72ab60418bee-kube-api-access-ls28q" (OuterVolumeSpecName: "kube-api-access-ls28q") pod "0d20a784-4e75-4f0b-bcf2-72ab60418bee" (UID: "0d20a784-4e75-4f0b-bcf2-72ab60418bee"). InnerVolumeSpecName "kube-api-access-ls28q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.356429 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0effab61-b755-4bd2-afb3-71cdf7983dc3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.356484 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0effab61-b755-4bd2-afb3-71cdf7983dc3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.358041 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0effab61-b755-4bd2-afb3-71cdf7983dc3-config-data\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.368428 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpn7k\" (UniqueName: \"kubernetes.io/projected/0effab61-b755-4bd2-afb3-71cdf7983dc3-kube-api-access-qpn7k\") pod \"nova-metadata-0\" (UID: \"0effab61-b755-4bd2-afb3-71cdf7983dc3\") " pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.385467 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d20a784-4e75-4f0b-bcf2-72ab60418bee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d20a784-4e75-4f0b-bcf2-72ab60418bee" (UID: "0d20a784-4e75-4f0b-bcf2-72ab60418bee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.388030 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d20a784-4e75-4f0b-bcf2-72ab60418bee-config-data" (OuterVolumeSpecName: "config-data") pod "0d20a784-4e75-4f0b-bcf2-72ab60418bee" (UID: "0d20a784-4e75-4f0b-bcf2-72ab60418bee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.456976 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls28q\" (UniqueName: \"kubernetes.io/projected/0d20a784-4e75-4f0b-bcf2-72ab60418bee-kube-api-access-ls28q\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.457039 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d20a784-4e75-4f0b-bcf2-72ab60418bee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.457059 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d20a784-4e75-4f0b-bcf2-72ab60418bee-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.534767 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.535845 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.556499 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.568185 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 05:49:13 crc kubenswrapper[4956]: E0930 05:49:13.568815 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d20a784-4e75-4f0b-bcf2-72ab60418bee" containerName="nova-scheduler-scheduler" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.568839 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d20a784-4e75-4f0b-bcf2-72ab60418bee" containerName="nova-scheduler-scheduler" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.569231 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d20a784-4e75-4f0b-bcf2-72ab60418bee" containerName="nova-scheduler-scheduler" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.570878 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.573750 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.574030 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.585866 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.588728 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.663910 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jg7r\" (UniqueName: \"kubernetes.io/projected/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-kube-api-access-5jg7r\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.664461 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-public-tls-certs\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.664510 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.664563 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.664588 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-logs\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.664659 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-config-data\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.765911 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-config-data\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.766016 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jg7r\" (UniqueName: \"kubernetes.io/projected/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-kube-api-access-5jg7r\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.766103 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-public-tls-certs\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 
30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.766188 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.766282 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.766319 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-logs\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.767038 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-logs\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.771439 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.778657 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.786390 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-config-data\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.788577 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jg7r\" (UniqueName: \"kubernetes.io/projected/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-kube-api-access-5jg7r\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.802449 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a7e115a-efb9-4ea4-aed6-efa6d4b80203-public-tls-certs\") pod \"nova-api-0\" (UID: \"6a7e115a-efb9-4ea4-aed6-efa6d4b80203\") " pod="openstack/nova-api-0" Sep 30 05:49:13 crc kubenswrapper[4956]: I0930 05:49:13.989578 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.065192 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.146355 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0effab61-b755-4bd2-afb3-71cdf7983dc3","Type":"ContainerStarted","Data":"f535cb8c3457bea18da96b48c7713017c3619528d60cdde532ca7da2fc7b1f8c"} Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.149955 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d20a784-4e75-4f0b-bcf2-72ab60418bee","Type":"ContainerDied","Data":"82007b71982ab22cb05e3bde52940313367eaf08e059961489f1f5cc09a59c65"} Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.149994 4956 scope.go:117] "RemoveContainer" containerID="16163f5ab02a322505f394312fd9f2e58fca953b87746af3abb1e923670509a5" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.150153 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.198287 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.215880 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.234092 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.235630 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.240405 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.244735 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.275835 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c3b340-db8b-4ec8-8554-556d914309a2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"94c3b340-db8b-4ec8-8554-556d914309a2\") " pod="openstack/nova-scheduler-0" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.275923 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5g4j\" (UniqueName: \"kubernetes.io/projected/94c3b340-db8b-4ec8-8554-556d914309a2-kube-api-access-m5g4j\") pod \"nova-scheduler-0\" (UID: \"94c3b340-db8b-4ec8-8554-556d914309a2\") " pod="openstack/nova-scheduler-0" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.276066 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c3b340-db8b-4ec8-8554-556d914309a2-config-data\") pod \"nova-scheduler-0\" (UID: \"94c3b340-db8b-4ec8-8554-556d914309a2\") " pod="openstack/nova-scheduler-0" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.352665 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d20a784-4e75-4f0b-bcf2-72ab60418bee" path="/var/lib/kubelet/pods/0d20a784-4e75-4f0b-bcf2-72ab60418bee/volumes" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.353597 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f34a355-be28-482b-a242-f113c5c9fd59" 
path="/var/lib/kubelet/pods/2f34a355-be28-482b-a242-f113c5c9fd59/volumes" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.354550 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="696241f2-167e-4e1a-8791-3df5581472fe" path="/var/lib/kubelet/pods/696241f2-167e-4e1a-8791-3df5581472fe/volumes" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.378514 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c3b340-db8b-4ec8-8554-556d914309a2-config-data\") pod \"nova-scheduler-0\" (UID: \"94c3b340-db8b-4ec8-8554-556d914309a2\") " pod="openstack/nova-scheduler-0" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.378642 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c3b340-db8b-4ec8-8554-556d914309a2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"94c3b340-db8b-4ec8-8554-556d914309a2\") " pod="openstack/nova-scheduler-0" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.378714 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5g4j\" (UniqueName: \"kubernetes.io/projected/94c3b340-db8b-4ec8-8554-556d914309a2-kube-api-access-m5g4j\") pod \"nova-scheduler-0\" (UID: \"94c3b340-db8b-4ec8-8554-556d914309a2\") " pod="openstack/nova-scheduler-0" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.383450 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c3b340-db8b-4ec8-8554-556d914309a2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"94c3b340-db8b-4ec8-8554-556d914309a2\") " pod="openstack/nova-scheduler-0" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.383872 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/94c3b340-db8b-4ec8-8554-556d914309a2-config-data\") pod \"nova-scheduler-0\" (UID: \"94c3b340-db8b-4ec8-8554-556d914309a2\") " pod="openstack/nova-scheduler-0" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.394719 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5g4j\" (UniqueName: \"kubernetes.io/projected/94c3b340-db8b-4ec8-8554-556d914309a2-kube-api-access-m5g4j\") pod \"nova-scheduler-0\" (UID: \"94c3b340-db8b-4ec8-8554-556d914309a2\") " pod="openstack/nova-scheduler-0" Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.489913 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 05:49:14 crc kubenswrapper[4956]: W0930 05:49:14.490295 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a7e115a_efb9_4ea4_aed6_efa6d4b80203.slice/crio-6baffe22a6e20a235a772870f1ca9929a9628a8c2e7cc294383b064eb07e8a89 WatchSource:0}: Error finding container 6baffe22a6e20a235a772870f1ca9929a9628a8c2e7cc294383b064eb07e8a89: Status 404 returned error can't find the container with id 6baffe22a6e20a235a772870f1ca9929a9628a8c2e7cc294383b064eb07e8a89 Sep 30 05:49:14 crc kubenswrapper[4956]: I0930 05:49:14.557799 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 05:49:15 crc kubenswrapper[4956]: I0930 05:49:15.012266 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 05:49:15 crc kubenswrapper[4956]: I0930 05:49:15.173293 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0effab61-b755-4bd2-afb3-71cdf7983dc3","Type":"ContainerStarted","Data":"e0069a9b2297ae47c9498a6ed1f77f3735cfa9ad0267441d534dc51fce68e4ae"} Sep 30 05:49:15 crc kubenswrapper[4956]: I0930 05:49:15.173338 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0effab61-b755-4bd2-afb3-71cdf7983dc3","Type":"ContainerStarted","Data":"b0f179e2d09940cc4ec22a6c875db849b2494fad2bf8776d1b6a24244815b588"} Sep 30 05:49:15 crc kubenswrapper[4956]: I0930 05:49:15.179593 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"94c3b340-db8b-4ec8-8554-556d914309a2","Type":"ContainerStarted","Data":"b6f9d7065be33f23c7366d462392da1fc6819c16154d573df2ae41346624b182"} Sep 30 05:49:15 crc kubenswrapper[4956]: I0930 05:49:15.183374 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a7e115a-efb9-4ea4-aed6-efa6d4b80203","Type":"ContainerStarted","Data":"d61fb2994ff96e2b65139e6ee8780d773af413a14e96d553ce0e97aa4c08f8b1"} Sep 30 05:49:15 crc kubenswrapper[4956]: I0930 05:49:15.183406 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a7e115a-efb9-4ea4-aed6-efa6d4b80203","Type":"ContainerStarted","Data":"e62113b721ab75182e9994122b5ec821d23b46ad9fc6b477775c911b517420d7"} Sep 30 05:49:15 crc kubenswrapper[4956]: I0930 05:49:15.183419 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"6a7e115a-efb9-4ea4-aed6-efa6d4b80203","Type":"ContainerStarted","Data":"6baffe22a6e20a235a772870f1ca9929a9628a8c2e7cc294383b064eb07e8a89"} Sep 30 05:49:15 crc kubenswrapper[4956]: I0930 05:49:15.218812 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.218797203 podStartE2EDuration="2.218797203s" podCreationTimestamp="2025-09-30 05:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:49:15.195222291 +0000 UTC m=+1225.522342816" watchObservedRunningTime="2025-09-30 05:49:15.218797203 +0000 UTC m=+1225.545917728" Sep 30 05:49:15 crc kubenswrapper[4956]: I0930 05:49:15.222310 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.222300362 podStartE2EDuration="2.222300362s" podCreationTimestamp="2025-09-30 05:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:49:15.218326569 +0000 UTC m=+1225.545447094" watchObservedRunningTime="2025-09-30 05:49:15.222300362 +0000 UTC m=+1225.549420887" Sep 30 05:49:16 crc kubenswrapper[4956]: I0930 05:49:16.196505 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"94c3b340-db8b-4ec8-8554-556d914309a2","Type":"ContainerStarted","Data":"ba4d830a4ca355ed7fddb513acb3714b9f13d824e30eb7b8d8a9a31b394bb475"} Sep 30 05:49:18 crc kubenswrapper[4956]: I0930 05:49:18.073165 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:49:18 crc kubenswrapper[4956]: I0930 05:49:18.073226 4956 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:49:18 crc kubenswrapper[4956]: I0930 05:49:18.535883 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 05:49:18 crc kubenswrapper[4956]: I0930 05:49:18.535940 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 05:49:19 crc kubenswrapper[4956]: I0930 05:49:19.558141 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 05:49:23 crc kubenswrapper[4956]: I0930 05:49:23.536897 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 05:49:23 crc kubenswrapper[4956]: I0930 05:49:23.537617 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 05:49:23 crc kubenswrapper[4956]: I0930 05:49:23.990945 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 05:49:23 crc kubenswrapper[4956]: I0930 05:49:23.991009 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 05:49:24 crc kubenswrapper[4956]: I0930 05:49:24.558362 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0effab61-b755-4bd2-afb3-71cdf7983dc3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 05:49:24 crc kubenswrapper[4956]: I0930 05:49:24.558370 4956 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="0effab61-b755-4bd2-afb3-71cdf7983dc3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 05:49:24 crc kubenswrapper[4956]: I0930 05:49:24.558639 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 05:49:24 crc kubenswrapper[4956]: I0930 05:49:24.586812 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 05:49:24 crc kubenswrapper[4956]: I0930 05:49:24.611491 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=10.611466326 podStartE2EDuration="10.611466326s" podCreationTimestamp="2025-09-30 05:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:49:16.222379094 +0000 UTC m=+1226.549499619" watchObservedRunningTime="2025-09-30 05:49:24.611466326 +0000 UTC m=+1234.938586851" Sep 30 05:49:24 crc kubenswrapper[4956]: I0930 05:49:24.999448 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6a7e115a-efb9-4ea4-aed6-efa6d4b80203" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.221:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 05:49:25 crc kubenswrapper[4956]: I0930 05:49:25.007461 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6a7e115a-efb9-4ea4-aed6-efa6d4b80203" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.221:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 05:49:25 crc kubenswrapper[4956]: I0930 05:49:25.320408 4956 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 05:49:27 crc kubenswrapper[4956]: I0930 05:49:27.290246 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 05:49:33 crc kubenswrapper[4956]: I0930 05:49:33.543257 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 05:49:33 crc kubenswrapper[4956]: I0930 05:49:33.545652 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 05:49:33 crc kubenswrapper[4956]: I0930 05:49:33.553516 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 05:49:33 crc kubenswrapper[4956]: I0930 05:49:33.999356 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 05:49:33 crc kubenswrapper[4956]: I0930 05:49:33.999730 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 05:49:34 crc kubenswrapper[4956]: I0930 05:49:34.000029 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 05:49:34 crc kubenswrapper[4956]: I0930 05:49:34.000066 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 05:49:34 crc kubenswrapper[4956]: I0930 05:49:34.008834 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 05:49:34 crc kubenswrapper[4956]: I0930 05:49:34.009455 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 05:49:34 crc kubenswrapper[4956]: I0930 05:49:34.384646 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 05:49:42 crc kubenswrapper[4956]: I0930 05:49:42.015361 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-server-0"] Sep 30 05:49:42 crc kubenswrapper[4956]: I0930 05:49:42.751394 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 05:49:45 crc kubenswrapper[4956]: I0930 05:49:45.440755 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="0ae54b47-b5ac-43a0-9752-797d2f81ff29" containerName="rabbitmq" containerID="cri-o://e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc" gracePeriod=604797 Sep 30 05:49:45 crc kubenswrapper[4956]: I0930 05:49:45.997931 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" containerName="rabbitmq" containerID="cri-o://60e10f761fe551d720bc4696359254ed9521fee7c50401375c9354a6c9aabe0a" gracePeriod=604797 Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.040862 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.231785 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-confd\") pod \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.231863 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j2s4\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-kube-api-access-6j2s4\") pod \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.231909 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-config-data\") pod \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.231949 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ae54b47-b5ac-43a0-9752-797d2f81ff29-erlang-cookie-secret\") pod \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.232103 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-plugins\") pod \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.232181 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-erlang-cookie\") pod \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.232289 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ae54b47-b5ac-43a0-9752-797d2f81ff29-pod-info\") pod \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.232373 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-plugins-conf\") pod \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.232430 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.232455 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-server-conf\") pod \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.232508 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-tls\") pod \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\" (UID: \"0ae54b47-b5ac-43a0-9752-797d2f81ff29\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 
05:49:47.235686 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0ae54b47-b5ac-43a0-9752-797d2f81ff29" (UID: "0ae54b47-b5ac-43a0-9752-797d2f81ff29"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.240678 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0ae54b47-b5ac-43a0-9752-797d2f81ff29" (UID: "0ae54b47-b5ac-43a0-9752-797d2f81ff29"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.250229 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0ae54b47-b5ac-43a0-9752-797d2f81ff29-pod-info" (OuterVolumeSpecName: "pod-info") pod "0ae54b47-b5ac-43a0-9752-797d2f81ff29" (UID: "0ae54b47-b5ac-43a0-9752-797d2f81ff29"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.254438 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae54b47-b5ac-43a0-9752-797d2f81ff29-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0ae54b47-b5ac-43a0-9752-797d2f81ff29" (UID: "0ae54b47-b5ac-43a0-9752-797d2f81ff29"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.254564 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0ae54b47-b5ac-43a0-9752-797d2f81ff29" (UID: "0ae54b47-b5ac-43a0-9752-797d2f81ff29"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.255442 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-kube-api-access-6j2s4" (OuterVolumeSpecName: "kube-api-access-6j2s4") pod "0ae54b47-b5ac-43a0-9752-797d2f81ff29" (UID: "0ae54b47-b5ac-43a0-9752-797d2f81ff29"). InnerVolumeSpecName "kube-api-access-6j2s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.256249 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0ae54b47-b5ac-43a0-9752-797d2f81ff29" (UID: "0ae54b47-b5ac-43a0-9752-797d2f81ff29"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.257559 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "0ae54b47-b5ac-43a0-9752-797d2f81ff29" (UID: "0ae54b47-b5ac-43a0-9752-797d2f81ff29"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.314227 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-config-data" (OuterVolumeSpecName: "config-data") pod "0ae54b47-b5ac-43a0-9752-797d2f81ff29" (UID: "0ae54b47-b5ac-43a0-9752-797d2f81ff29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.330611 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-server-conf" (OuterVolumeSpecName: "server-conf") pod "0ae54b47-b5ac-43a0-9752-797d2f81ff29" (UID: "0ae54b47-b5ac-43a0-9752-797d2f81ff29"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.335321 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j2s4\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-kube-api-access-6j2s4\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.335356 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.335366 4956 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ae54b47-b5ac-43a0-9752-797d2f81ff29-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.335376 4956 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-plugins\") on node 
\"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.335385 4956 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.335392 4956 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ae54b47-b5ac-43a0-9752-797d2f81ff29-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.335400 4956 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.335428 4956 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.335436 4956 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ae54b47-b5ac-43a0-9752-797d2f81ff29-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.335446 4956 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.365352 4956 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.422683 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0ae54b47-b5ac-43a0-9752-797d2f81ff29" (UID: "0ae54b47-b5ac-43a0-9752-797d2f81ff29"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.440978 4956 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.441019 4956 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ae54b47-b5ac-43a0-9752-797d2f81ff29-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.546594 4956 generic.go:334] "Generic (PLEG): container finished" podID="0ae54b47-b5ac-43a0-9752-797d2f81ff29" containerID="e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc" exitCode=0 Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.546670 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.546688 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ae54b47-b5ac-43a0-9752-797d2f81ff29","Type":"ContainerDied","Data":"e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc"} Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.546716 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ae54b47-b5ac-43a0-9752-797d2f81ff29","Type":"ContainerDied","Data":"d0d5a22f3362a82099890e3ce6be4ade017eca0e60431422b5b4bb86f161a72a"} Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.546733 4956 scope.go:117] "RemoveContainer" containerID="e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.553701 4956 generic.go:334] "Generic (PLEG): container finished" podID="8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" containerID="60e10f761fe551d720bc4696359254ed9521fee7c50401375c9354a6c9aabe0a" exitCode=0 Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.553754 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4","Type":"ContainerDied","Data":"60e10f761fe551d720bc4696359254ed9521fee7c50401375c9354a6c9aabe0a"} Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.569382 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.585972 4956 scope.go:117] "RemoveContainer" containerID="a061664973516acc797dccde5fe413c411f5ff1d462127643f441eef8a326d25" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.593033 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.602290 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.612338 4956 scope.go:117] "RemoveContainer" containerID="e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.615448 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 05:49:47 crc kubenswrapper[4956]: E0930 05:49:47.615997 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae54b47-b5ac-43a0-9752-797d2f81ff29" containerName="rabbitmq" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.616018 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae54b47-b5ac-43a0-9752-797d2f81ff29" containerName="rabbitmq" Sep 30 05:49:47 crc kubenswrapper[4956]: E0930 05:49:47.616029 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" containerName="setup-container" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.616036 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" containerName="setup-container" Sep 30 05:49:47 crc kubenswrapper[4956]: E0930 05:49:47.616054 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae54b47-b5ac-43a0-9752-797d2f81ff29" containerName="setup-container" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.616060 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0ae54b47-b5ac-43a0-9752-797d2f81ff29" containerName="setup-container" Sep 30 05:49:47 crc kubenswrapper[4956]: E0930 05:49:47.616077 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" containerName="rabbitmq" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.616082 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" containerName="rabbitmq" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.616391 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae54b47-b5ac-43a0-9752-797d2f81ff29" containerName="rabbitmq" Sep 30 05:49:47 crc kubenswrapper[4956]: E0930 05:49:47.616490 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc\": container with ID starting with e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc not found: ID does not exist" containerID="e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.616518 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc"} err="failed to get container status \"e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc\": rpc error: code = NotFound desc = could not find container \"e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc\": container with ID starting with e61c50c554e0a10a533ac8dce9b20996c6d75b3a451a55fe190f70019c3dd3cc not found: ID does not exist" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.616543 4956 scope.go:117] "RemoveContainer" containerID="a061664973516acc797dccde5fe413c411f5ff1d462127643f441eef8a326d25" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.616640 4956 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" containerName="rabbitmq" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.617733 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: E0930 05:49:47.618839 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a061664973516acc797dccde5fe413c411f5ff1d462127643f441eef8a326d25\": container with ID starting with a061664973516acc797dccde5fe413c411f5ff1d462127643f441eef8a326d25 not found: ID does not exist" containerID="a061664973516acc797dccde5fe413c411f5ff1d462127643f441eef8a326d25" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.618895 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a061664973516acc797dccde5fe413c411f5ff1d462127643f441eef8a326d25"} err="failed to get container status \"a061664973516acc797dccde5fe413c411f5ff1d462127643f441eef8a326d25\": rpc error: code = NotFound desc = could not find container \"a061664973516acc797dccde5fe413c411f5ff1d462127643f441eef8a326d25\": container with ID starting with a061664973516acc797dccde5fe413c411f5ff1d462127643f441eef8a326d25 not found: ID does not exist" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.619559 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.619773 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.620312 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.620547 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" 
Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.620833 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.621017 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8hjkc" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.622165 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.644778 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.745516 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dtpc\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-kube-api-access-4dtpc\") pod \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.745608 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-confd\") pod \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.745633 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-erlang-cookie-secret\") pod \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.745674 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.745697 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-server-conf\") pod \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.745731 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-tls\") pod \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.745803 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-plugins\") pod \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.745854 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-erlang-cookie\") pod \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.745893 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-pod-info\") pod \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.745924 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-config-data\") pod \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.745969 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-plugins-conf\") pod \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\" (UID: \"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4\") " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.746230 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ac3f1eed-d128-4008-bbe5-0f319495ef52-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.746293 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.746318 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kmp6\" (UniqueName: \"kubernetes.io/projected/ac3f1eed-d128-4008-bbe5-0f319495ef52-kube-api-access-9kmp6\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.746345 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ac3f1eed-d128-4008-bbe5-0f319495ef52-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.746376 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ac3f1eed-d128-4008-bbe5-0f319495ef52-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.746418 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ac3f1eed-d128-4008-bbe5-0f319495ef52-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.746455 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac3f1eed-d128-4008-bbe5-0f319495ef52-config-data\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.746509 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ac3f1eed-d128-4008-bbe5-0f319495ef52-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.746543 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ac3f1eed-d128-4008-bbe5-0f319495ef52-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " 
pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.746585 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ac3f1eed-d128-4008-bbe5-0f319495ef52-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.746602 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ac3f1eed-d128-4008-bbe5-0f319495ef52-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.747955 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" (UID: "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.747972 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" (UID: "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.747971 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" (UID: "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.752224 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" (UID: "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.752862 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" (UID: "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.752974 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-kube-api-access-4dtpc" (OuterVolumeSpecName: "kube-api-access-4dtpc") pod "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" (UID: "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4"). InnerVolumeSpecName "kube-api-access-4dtpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.758969 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" (UID: "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.763898 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-pod-info" (OuterVolumeSpecName: "pod-info") pod "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" (UID: "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.772926 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-config-data" (OuterVolumeSpecName: "config-data") pod "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" (UID: "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.815230 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-server-conf" (OuterVolumeSpecName: "server-conf") pod "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" (UID: "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848460 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ac3f1eed-d128-4008-bbe5-0f319495ef52-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848515 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ac3f1eed-d128-4008-bbe5-0f319495ef52-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848558 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ac3f1eed-d128-4008-bbe5-0f319495ef52-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848578 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ac3f1eed-d128-4008-bbe5-0f319495ef52-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848627 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ac3f1eed-d128-4008-bbe5-0f319495ef52-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848665 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848694 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kmp6\" (UniqueName: \"kubernetes.io/projected/ac3f1eed-d128-4008-bbe5-0f319495ef52-kube-api-access-9kmp6\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848716 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ac3f1eed-d128-4008-bbe5-0f319495ef52-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848741 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ac3f1eed-d128-4008-bbe5-0f319495ef52-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848773 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ac3f1eed-d128-4008-bbe5-0f319495ef52-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848874 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac3f1eed-d128-4008-bbe5-0f319495ef52-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848916 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ac3f1eed-d128-4008-bbe5-0f319495ef52-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848943 4956 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848956 4956 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848964 4956 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848974 4956 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848983 4956 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.848991 4956 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.849001 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.849009 4956 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.849017 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dtpc\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-kube-api-access-4dtpc\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.849025 4956 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.849507 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ac3f1eed-d128-4008-bbe5-0f319495ef52-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.850340 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ac3f1eed-d128-4008-bbe5-0f319495ef52-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.850448 4956 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.850910 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ac3f1eed-d128-4008-bbe5-0f319495ef52-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.850952 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac3f1eed-d128-4008-bbe5-0f319495ef52-config-data\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.853923 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ac3f1eed-d128-4008-bbe5-0f319495ef52-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.855438 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ac3f1eed-d128-4008-bbe5-0f319495ef52-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.856831 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ac3f1eed-d128-4008-bbe5-0f319495ef52-rabbitmq-tls\") pod \"rabbitmq-server-0\" 
(UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.857254 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ac3f1eed-d128-4008-bbe5-0f319495ef52-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.866775 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kmp6\" (UniqueName: \"kubernetes.io/projected/ac3f1eed-d128-4008-bbe5-0f319495ef52-kube-api-access-9kmp6\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.877029 4956 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.889881 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" (UID: "8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.890543 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"ac3f1eed-d128-4008-bbe5-0f319495ef52\") " pod="openstack/rabbitmq-server-0" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.950654 4956 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.950692 4956 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 30 05:49:47 crc kubenswrapper[4956]: I0930 05:49:47.979360 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.073583 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.073629 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.354822 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae54b47-b5ac-43a0-9752-797d2f81ff29" path="/var/lib/kubelet/pods/0ae54b47-b5ac-43a0-9752-797d2f81ff29/volumes" Sep 30 05:49:48 crc kubenswrapper[4956]: W0930 05:49:48.417036 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac3f1eed_d128_4008_bbe5_0f319495ef52.slice/crio-5567bcd8cf7bfe1446ddd6dc82e4d5d1235ea3139c1160ce819b9f20e5ed7998 WatchSource:0}: Error finding container 5567bcd8cf7bfe1446ddd6dc82e4d5d1235ea3139c1160ce819b9f20e5ed7998: Status 404 returned error can't find the container with id 5567bcd8cf7bfe1446ddd6dc82e4d5d1235ea3139c1160ce819b9f20e5ed7998 Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.423983 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.564035 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"ac3f1eed-d128-4008-bbe5-0f319495ef52","Type":"ContainerStarted","Data":"5567bcd8cf7bfe1446ddd6dc82e4d5d1235ea3139c1160ce819b9f20e5ed7998"} Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.567707 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4","Type":"ContainerDied","Data":"7fef3a34e10a3a8f9c7f222c85bddcca9409b06a47aa55d2c9065e71a5c8709a"} Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.567744 4956 scope.go:117] "RemoveContainer" containerID="60e10f761fe551d720bc4696359254ed9521fee7c50401375c9354a6c9aabe0a" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.567794 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.608947 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.630027 4956 scope.go:117] "RemoveContainer" containerID="1dd46384bd89f5a02cbb07228ce6eb6563ced7e59f1d12f3cc83c531865ff7de" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.635211 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.642392 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.644568 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.649520 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.651802 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.652621 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.653014 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rxl6n" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.653068 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.653080 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.653149 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.677043 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.764421 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a47b02d9-bc18-4fa1-a0a9-1918de176de9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.764655 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a47b02d9-bc18-4fa1-a0a9-1918de176de9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.764797 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.764880 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a47b02d9-bc18-4fa1-a0a9-1918de176de9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.764953 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a47b02d9-bc18-4fa1-a0a9-1918de176de9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.765048 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsmls\" (UniqueName: \"kubernetes.io/projected/a47b02d9-bc18-4fa1-a0a9-1918de176de9-kube-api-access-hsmls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.765129 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/a47b02d9-bc18-4fa1-a0a9-1918de176de9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.765224 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a47b02d9-bc18-4fa1-a0a9-1918de176de9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.765343 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a47b02d9-bc18-4fa1-a0a9-1918de176de9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.765467 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a47b02d9-bc18-4fa1-a0a9-1918de176de9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.765770 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a47b02d9-bc18-4fa1-a0a9-1918de176de9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.867598 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/a47b02d9-bc18-4fa1-a0a9-1918de176de9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.868325 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a47b02d9-bc18-4fa1-a0a9-1918de176de9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.868358 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.868383 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a47b02d9-bc18-4fa1-a0a9-1918de176de9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.868406 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a47b02d9-bc18-4fa1-a0a9-1918de176de9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.868438 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsmls\" (UniqueName: \"kubernetes.io/projected/a47b02d9-bc18-4fa1-a0a9-1918de176de9-kube-api-access-hsmls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.868930 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.869065 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a47b02d9-bc18-4fa1-a0a9-1918de176de9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.869193 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a47b02d9-bc18-4fa1-a0a9-1918de176de9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.869263 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a47b02d9-bc18-4fa1-a0a9-1918de176de9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.869418 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a47b02d9-bc18-4fa1-a0a9-1918de176de9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 
30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.869512 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a47b02d9-bc18-4fa1-a0a9-1918de176de9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.869562 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a47b02d9-bc18-4fa1-a0a9-1918de176de9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.869628 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a47b02d9-bc18-4fa1-a0a9-1918de176de9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.869977 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a47b02d9-bc18-4fa1-a0a9-1918de176de9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.870358 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a47b02d9-bc18-4fa1-a0a9-1918de176de9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.870436 4956 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a47b02d9-bc18-4fa1-a0a9-1918de176de9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.873889 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a47b02d9-bc18-4fa1-a0a9-1918de176de9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.880065 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a47b02d9-bc18-4fa1-a0a9-1918de176de9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.880522 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a47b02d9-bc18-4fa1-a0a9-1918de176de9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.885060 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a47b02d9-bc18-4fa1-a0a9-1918de176de9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.887691 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsmls\" (UniqueName: \"kubernetes.io/projected/a47b02d9-bc18-4fa1-a0a9-1918de176de9-kube-api-access-hsmls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.916138 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a47b02d9-bc18-4fa1-a0a9-1918de176de9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:48 crc kubenswrapper[4956]: I0930 05:49:48.993268 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 05:49:49 crc kubenswrapper[4956]: I0930 05:49:49.430818 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 05:49:49 crc kubenswrapper[4956]: W0930 05:49:49.432584 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda47b02d9_bc18_4fa1_a0a9_1918de176de9.slice/crio-1693aaf27423efd1b8967cc6c1a7d46dd29fa7c74a0eadd4c493e2a8eed05285 WatchSource:0}: Error finding container 1693aaf27423efd1b8967cc6c1a7d46dd29fa7c74a0eadd4c493e2a8eed05285: Status 404 returned error can't find the container with id 1693aaf27423efd1b8967cc6c1a7d46dd29fa7c74a0eadd4c493e2a8eed05285 Sep 30 05:49:49 crc kubenswrapper[4956]: I0930 05:49:49.587699 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ac3f1eed-d128-4008-bbe5-0f319495ef52","Type":"ContainerStarted","Data":"698211aeb94a111b56dbd01a2c44b451bbcde240112dc4b4dd694362064485c4"} Sep 30 05:49:49 crc kubenswrapper[4956]: I0930 05:49:49.589690 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a47b02d9-bc18-4fa1-a0a9-1918de176de9","Type":"ContainerStarted","Data":"1693aaf27423efd1b8967cc6c1a7d46dd29fa7c74a0eadd4c493e2a8eed05285"} Sep 30 05:49:50 crc kubenswrapper[4956]: 
I0930 05:49:50.362914 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4" path="/var/lib/kubelet/pods/8fd85fdd-3cb1-4a7e-9aea-8823050ae1f4/volumes" Sep 30 05:49:50 crc kubenswrapper[4956]: I0930 05:49:50.602615 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a47b02d9-bc18-4fa1-a0a9-1918de176de9","Type":"ContainerStarted","Data":"e6f81266c254cc24e88c748f5a1f8e507a715837b76024f443fb521966434a41"} Sep 30 05:49:56 crc kubenswrapper[4956]: I0930 05:49:56.859889 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-kdlv5"] Sep 30 05:49:56 crc kubenswrapper[4956]: I0930 05:49:56.862292 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:56 crc kubenswrapper[4956]: I0930 05:49:56.869075 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Sep 30 05:49:56 crc kubenswrapper[4956]: I0930 05:49:56.891884 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-kdlv5"] Sep 30 05:49:56 crc kubenswrapper[4956]: I0930 05:49:56.924105 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-openstack-edpm-ipam\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:56 crc kubenswrapper[4956]: I0930 05:49:56.924169 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4s7v\" (UniqueName: \"kubernetes.io/projected/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-kube-api-access-q4s7v\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " 
pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:56 crc kubenswrapper[4956]: I0930 05:49:56.924331 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-ovsdbserver-nb\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:56 crc kubenswrapper[4956]: I0930 05:49:56.924365 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-config\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:56 crc kubenswrapper[4956]: I0930 05:49:56.924420 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-dns-swift-storage-0\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:56 crc kubenswrapper[4956]: I0930 05:49:56.924458 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-ovsdbserver-sb\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:56 crc kubenswrapper[4956]: I0930 05:49:56.924483 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-dns-svc\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: 
\"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.026628 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-dns-swift-storage-0\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.026706 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-ovsdbserver-sb\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.026741 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-dns-svc\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.026789 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-openstack-edpm-ipam\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.026811 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4s7v\" (UniqueName: \"kubernetes.io/projected/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-kube-api-access-q4s7v\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " 
pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.026876 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-ovsdbserver-nb\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.026902 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-config\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.027703 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-dns-swift-storage-0\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.027936 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-ovsdbserver-sb\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.027938 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-config\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.028356 
4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-openstack-edpm-ipam\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.028770 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-ovsdbserver-nb\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.028978 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-dns-svc\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.046192 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4s7v\" (UniqueName: \"kubernetes.io/projected/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-kube-api-access-q4s7v\") pod \"dnsmasq-dns-bf6c7df67-kdlv5\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") " pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.179522 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.625276 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-kdlv5"] Sep 30 05:49:57 crc kubenswrapper[4956]: I0930 05:49:57.668368 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" event={"ID":"a1d5c3da-4ce1-4408-b18d-6dae65f47b36","Type":"ContainerStarted","Data":"22d0165b49a7dcb14f2ff33b6ecf418245a07bdaab5100e3ef9bac4d78bfb366"} Sep 30 05:49:58 crc kubenswrapper[4956]: I0930 05:49:58.680158 4956 generic.go:334] "Generic (PLEG): container finished" podID="a1d5c3da-4ce1-4408-b18d-6dae65f47b36" containerID="ee8e6832c3d4029558e345c19d54c95ab59b728f99243f6ba50a7607f067c2b6" exitCode=0 Sep 30 05:49:58 crc kubenswrapper[4956]: I0930 05:49:58.680259 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" event={"ID":"a1d5c3da-4ce1-4408-b18d-6dae65f47b36","Type":"ContainerDied","Data":"ee8e6832c3d4029558e345c19d54c95ab59b728f99243f6ba50a7607f067c2b6"} Sep 30 05:49:59 crc kubenswrapper[4956]: I0930 05:49:59.694256 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" event={"ID":"a1d5c3da-4ce1-4408-b18d-6dae65f47b36","Type":"ContainerStarted","Data":"ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc"} Sep 30 05:49:59 crc kubenswrapper[4956]: I0930 05:49:59.694783 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:49:59 crc kubenswrapper[4956]: I0930 05:49:59.723432 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" podStartSLOduration=3.7234158280000003 podStartE2EDuration="3.723415828s" podCreationTimestamp="2025-09-30 05:49:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:49:59.716706677 +0000 UTC m=+1270.043827222" watchObservedRunningTime="2025-09-30 05:49:59.723415828 +0000 UTC m=+1270.050536353" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.182423 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.296806 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-82dqm"] Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.297159 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54599d8f7-82dqm" podUID="09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" containerName="dnsmasq-dns" containerID="cri-o://fce76c36eca57c3cdbbc9a180515704cccf70830def5459def56dd92abaa6bc2" gracePeriod=10 Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.467271 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77b58f4b85-dnptc"] Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.469827 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.483773 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77b58f4b85-dnptc"] Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.546105 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-ovsdbserver-nb\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.546179 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-dns-swift-storage-0\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.546320 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-openstack-edpm-ipam\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.546347 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-config\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.546439 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vj47\" (UniqueName: \"kubernetes.io/projected/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-kube-api-access-4vj47\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.546469 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-dns-svc\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.546490 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-ovsdbserver-sb\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.648655 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vj47\" (UniqueName: \"kubernetes.io/projected/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-kube-api-access-4vj47\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.648724 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-dns-svc\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.648745 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-ovsdbserver-sb\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.648868 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-ovsdbserver-nb\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.648888 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-dns-swift-storage-0\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.648990 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-openstack-edpm-ipam\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.649025 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-config\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.649963 4956 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-config\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.649974 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-ovsdbserver-nb\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.650140 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-dns-svc\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.650536 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-ovsdbserver-sb\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.650642 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-dns-swift-storage-0\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.650670 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-openstack-edpm-ipam\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.679323 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vj47\" (UniqueName: \"kubernetes.io/projected/c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee-kube-api-access-4vj47\") pod \"dnsmasq-dns-77b58f4b85-dnptc\" (UID: \"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee\") " pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.784704 4956 generic.go:334] "Generic (PLEG): container finished" podID="09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" containerID="fce76c36eca57c3cdbbc9a180515704cccf70830def5459def56dd92abaa6bc2" exitCode=0 Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.784748 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-82dqm" event={"ID":"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7","Type":"ContainerDied","Data":"fce76c36eca57c3cdbbc9a180515704cccf70830def5459def56dd92abaa6bc2"} Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.802188 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.924315 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54599d8f7-82dqm" Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.953993 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-dns-swift-storage-0\") pod \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.954037 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-ovsdbserver-sb\") pod \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.954066 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-649l8\" (UniqueName: \"kubernetes.io/projected/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-kube-api-access-649l8\") pod \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.954089 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-config\") pod \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.954273 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-dns-svc\") pod \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.954323 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-ovsdbserver-nb\") pod \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\" (UID: \"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7\") " Sep 30 05:50:07 crc kubenswrapper[4956]: I0930 05:50:07.959932 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-kube-api-access-649l8" (OuterVolumeSpecName: "kube-api-access-649l8") pod "09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" (UID: "09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7"). InnerVolumeSpecName "kube-api-access-649l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.027824 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" (UID: "09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.035744 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" (UID: "09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.042058 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" (UID: "09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.043514 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-config" (OuterVolumeSpecName: "config") pod "09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" (UID: "09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.056630 4956 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.056670 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-649l8\" (UniqueName: \"kubernetes.io/projected/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-kube-api-access-649l8\") on node \"crc\" DevicePath \"\"" Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.056681 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-config\") on node \"crc\" DevicePath \"\"" Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.056692 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.056699 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.060206 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" (UID: "09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.158680 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.274676 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77b58f4b85-dnptc"] Sep 30 05:50:08 crc kubenswrapper[4956]: W0930 05:50:08.276907 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc38cc2b9_0cab_4a36_8d6e_070fbb2b6bee.slice/crio-c8f59502539275184595f54f9e197a2d82df060f5966ecb0c5d6671bb8b94635 WatchSource:0}: Error finding container c8f59502539275184595f54f9e197a2d82df060f5966ecb0c5d6671bb8b94635: Status 404 returned error can't find the container with id c8f59502539275184595f54f9e197a2d82df060f5966ecb0c5d6671bb8b94635 Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.797015 4956 generic.go:334] "Generic (PLEG): container finished" podID="c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee" containerID="d84f174186b8c715cc01ba65a2c175594d4467b014f9262af284401a24a8516b" exitCode=0 Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.797066 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" event={"ID":"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee","Type":"ContainerDied","Data":"d84f174186b8c715cc01ba65a2c175594d4467b014f9262af284401a24a8516b"} Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.797707 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" event={"ID":"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee","Type":"ContainerStarted","Data":"c8f59502539275184595f54f9e197a2d82df060f5966ecb0c5d6671bb8b94635"}
Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.800799 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54599d8f7-82dqm"
Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.800854 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-82dqm" event={"ID":"09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7","Type":"ContainerDied","Data":"d1b3f3098425edd8ff112d35ca5e4e2d74d7164bb8ff3d297ddedf99d260fd7d"}
Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.800894 4956 scope.go:117] "RemoveContainer" containerID="fce76c36eca57c3cdbbc9a180515704cccf70830def5459def56dd92abaa6bc2"
Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.833353 4956 scope.go:117] "RemoveContainer" containerID="78506152d99ea2083e4224957f6fb315167fa2dc7c97010edfdf553406f8e7fc"
Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.850305 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-82dqm"]
Sep 30 05:50:08 crc kubenswrapper[4956]: I0930 05:50:08.859692 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-82dqm"]
Sep 30 05:50:09 crc kubenswrapper[4956]: I0930 05:50:09.826918 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" event={"ID":"c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee","Type":"ContainerStarted","Data":"50aea13444e024c8a1e2cd896b8cf5b5073de2a82cffb14615d3c03e15644f45"}
Sep 30 05:50:09 crc kubenswrapper[4956]: I0930 05:50:09.828053 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77b58f4b85-dnptc"
Sep 30 05:50:09 crc kubenswrapper[4956]: I0930 05:50:09.844439 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77b58f4b85-dnptc" podStartSLOduration=2.844424411 podStartE2EDuration="2.844424411s" podCreationTimestamp="2025-09-30 05:50:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:50:09.844261226 +0000 UTC m=+1280.171381751" watchObservedRunningTime="2025-09-30 05:50:09.844424411 +0000 UTC m=+1280.171544936"
Sep 30 05:50:10 crc kubenswrapper[4956]: I0930 05:50:10.361343 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" path="/var/lib/kubelet/pods/09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7/volumes"
Sep 30 05:50:17 crc kubenswrapper[4956]: I0930 05:50:17.804319 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77b58f4b85-dnptc"
Sep 30 05:50:17 crc kubenswrapper[4956]: I0930 05:50:17.877966 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-kdlv5"]
Sep 30 05:50:17 crc kubenswrapper[4956]: I0930 05:50:17.878335 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" podUID="a1d5c3da-4ce1-4408-b18d-6dae65f47b36" containerName="dnsmasq-dns" containerID="cri-o://ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc" gracePeriod=10
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.073144 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.073495 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.073537 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm"
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.074385 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d6707b9d26600f66963acd8d20bb882ca13db20f72f914fce4d118ce50f76933"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.074449 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://d6707b9d26600f66963acd8d20bb882ca13db20f72f914fce4d118ce50f76933" gracePeriod=600
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.393694 4956 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5"
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.480429 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-dns-swift-storage-0\") pod \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") "
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.480469 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-openstack-edpm-ipam\") pod \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") "
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.480579 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-ovsdbserver-nb\") pod \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") "
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.480610 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-config\") pod \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") "
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.480642 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-dns-svc\") pod \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") "
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.480692 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4s7v\" (UniqueName: \"kubernetes.io/projected/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-kube-api-access-q4s7v\") pod \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") "
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.480713 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-ovsdbserver-sb\") pod \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\" (UID: \"a1d5c3da-4ce1-4408-b18d-6dae65f47b36\") "
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.485457 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-kube-api-access-q4s7v" (OuterVolumeSpecName: "kube-api-access-q4s7v") pod "a1d5c3da-4ce1-4408-b18d-6dae65f47b36" (UID: "a1d5c3da-4ce1-4408-b18d-6dae65f47b36"). InnerVolumeSpecName "kube-api-access-q4s7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.540725 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a1d5c3da-4ce1-4408-b18d-6dae65f47b36" (UID: "a1d5c3da-4ce1-4408-b18d-6dae65f47b36"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.543680 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a1d5c3da-4ce1-4408-b18d-6dae65f47b36" (UID: "a1d5c3da-4ce1-4408-b18d-6dae65f47b36"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.546686 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a1d5c3da-4ce1-4408-b18d-6dae65f47b36" (UID: "a1d5c3da-4ce1-4408-b18d-6dae65f47b36"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.551192 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-config" (OuterVolumeSpecName: "config") pod "a1d5c3da-4ce1-4408-b18d-6dae65f47b36" (UID: "a1d5c3da-4ce1-4408-b18d-6dae65f47b36"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.563416 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a1d5c3da-4ce1-4408-b18d-6dae65f47b36" (UID: "a1d5c3da-4ce1-4408-b18d-6dae65f47b36"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.570901 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a1d5c3da-4ce1-4408-b18d-6dae65f47b36" (UID: "a1d5c3da-4ce1-4408-b18d-6dae65f47b36"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.583215 4956 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.583255 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.583269 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-config\") on node \"crc\" DevicePath \"\""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.583281 4956 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.583292 4956 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.583302 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4s7v\" (UniqueName: \"kubernetes.io/projected/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-kube-api-access-q4s7v\") on node \"crc\" DevicePath \"\""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.583314 4956 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1d5c3da-4ce1-4408-b18d-6dae65f47b36-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.948304
4956 generic.go:334] "Generic (PLEG): container finished" podID="a1d5c3da-4ce1-4408-b18d-6dae65f47b36" containerID="ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc" exitCode=0
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.948422 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5"
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.948438 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" event={"ID":"a1d5c3da-4ce1-4408-b18d-6dae65f47b36","Type":"ContainerDied","Data":"ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc"}
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.948976 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-kdlv5" event={"ID":"a1d5c3da-4ce1-4408-b18d-6dae65f47b36","Type":"ContainerDied","Data":"22d0165b49a7dcb14f2ff33b6ecf418245a07bdaab5100e3ef9bac4d78bfb366"}
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.949010 4956 scope.go:117] "RemoveContainer" containerID="ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc"
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.960043 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="d6707b9d26600f66963acd8d20bb882ca13db20f72f914fce4d118ce50f76933" exitCode=0
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.960095 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"d6707b9d26600f66963acd8d20bb882ca13db20f72f914fce4d118ce50f76933"}
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.960197 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b"}
Sep 30 05:50:18 crc kubenswrapper[4956]: I0930 05:50:18.993216 4956 scope.go:117] "RemoveContainer" containerID="ee8e6832c3d4029558e345c19d54c95ab59b728f99243f6ba50a7607f067c2b6"
Sep 30 05:50:19 crc kubenswrapper[4956]: I0930 05:50:19.026465 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-kdlv5"]
Sep 30 05:50:19 crc kubenswrapper[4956]: I0930 05:50:19.041760 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-kdlv5"]
Sep 30 05:50:19 crc kubenswrapper[4956]: I0930 05:50:19.070049 4956 scope.go:117] "RemoveContainer" containerID="ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc"
Sep 30 05:50:19 crc kubenswrapper[4956]: E0930 05:50:19.070596 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc\": container with ID starting with ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc not found: ID does not exist" containerID="ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc"
Sep 30 05:50:19 crc kubenswrapper[4956]: I0930 05:50:19.070637 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc"} err="failed to get container status \"ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc\": rpc error: code = NotFound desc = could not find container \"ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc\": container with ID starting with ca25111a8227c6a50cdf6bb07a0cd5e05f9107bb53c03840f3cfe8cf281907dc not found: ID does not exist"
Sep 30 05:50:19 crc kubenswrapper[4956]: I0930 05:50:19.070664 4956 scope.go:117] "RemoveContainer" containerID="ee8e6832c3d4029558e345c19d54c95ab59b728f99243f6ba50a7607f067c2b6"
Sep 30 05:50:19 crc kubenswrapper[4956]: E0930 05:50:19.071025 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8e6832c3d4029558e345c19d54c95ab59b728f99243f6ba50a7607f067c2b6\": container with ID starting with ee8e6832c3d4029558e345c19d54c95ab59b728f99243f6ba50a7607f067c2b6 not found: ID does not exist" containerID="ee8e6832c3d4029558e345c19d54c95ab59b728f99243f6ba50a7607f067c2b6"
Sep 30 05:50:19 crc kubenswrapper[4956]: I0930 05:50:19.071052 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8e6832c3d4029558e345c19d54c95ab59b728f99243f6ba50a7607f067c2b6"} err="failed to get container status \"ee8e6832c3d4029558e345c19d54c95ab59b728f99243f6ba50a7607f067c2b6\": rpc error: code = NotFound desc = could not find container \"ee8e6832c3d4029558e345c19d54c95ab59b728f99243f6ba50a7607f067c2b6\": container with ID starting with ee8e6832c3d4029558e345c19d54c95ab59b728f99243f6ba50a7607f067c2b6 not found: ID does not exist"
Sep 30 05:50:19 crc kubenswrapper[4956]: I0930 05:50:19.071071 4956 scope.go:117] "RemoveContainer" containerID="f437f59e0bbde583a02d6871ab0b4a73cd7224d37812609e0ec49e3d0eab2998"
Sep 30 05:50:19 crc kubenswrapper[4956]: I0930 05:50:19.973030 4956 generic.go:334] "Generic (PLEG): container finished" podID="ac3f1eed-d128-4008-bbe5-0f319495ef52" containerID="698211aeb94a111b56dbd01a2c44b451bbcde240112dc4b4dd694362064485c4" exitCode=0
Sep 30 05:50:19 crc kubenswrapper[4956]: I0930 05:50:19.973184 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ac3f1eed-d128-4008-bbe5-0f319495ef52","Type":"ContainerDied","Data":"698211aeb94a111b56dbd01a2c44b451bbcde240112dc4b4dd694362064485c4"}
Sep 30 05:50:20 crc kubenswrapper[4956]: I0930 05:50:20.364025 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes
dir" podUID="a1d5c3da-4ce1-4408-b18d-6dae65f47b36" path="/var/lib/kubelet/pods/a1d5c3da-4ce1-4408-b18d-6dae65f47b36/volumes"
Sep 30 05:50:21 crc kubenswrapper[4956]: I0930 05:50:21.008811 4956 generic.go:334] "Generic (PLEG): container finished" podID="a47b02d9-bc18-4fa1-a0a9-1918de176de9" containerID="e6f81266c254cc24e88c748f5a1f8e507a715837b76024f443fb521966434a41" exitCode=0
Sep 30 05:50:21 crc kubenswrapper[4956]: I0930 05:50:21.008914 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a47b02d9-bc18-4fa1-a0a9-1918de176de9","Type":"ContainerDied","Data":"e6f81266c254cc24e88c748f5a1f8e507a715837b76024f443fb521966434a41"}
Sep 30 05:50:21 crc kubenswrapper[4956]: I0930 05:50:21.012140 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ac3f1eed-d128-4008-bbe5-0f319495ef52","Type":"ContainerStarted","Data":"fd6141080a7dd8d305ce86371df77b5d6d6f0ce3e85f9ae4487d4ea022c60fd3"}
Sep 30 05:50:21 crc kubenswrapper[4956]: I0930 05:50:21.012613 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Sep 30 05:50:21 crc kubenswrapper[4956]: I0930 05:50:21.073276 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=34.07325355 podStartE2EDuration="34.07325355s" podCreationTimestamp="2025-09-30 05:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:50:21.07292927 +0000 UTC m=+1291.400049815" watchObservedRunningTime="2025-09-30 05:50:21.07325355 +0000 UTC m=+1291.400374075"
Sep 30 05:50:22 crc kubenswrapper[4956]: I0930 05:50:22.022671 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a47b02d9-bc18-4fa1-a0a9-1918de176de9","Type":"ContainerStarted","Data":"d4c61cf65c00dd2860d0d0f6ef1df0cc77006de3c9bd146ac47ba1e4c2115ef6"}
Sep 30 05:50:22 crc kubenswrapper[4956]: I0930 05:50:22.023328 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 05:50:22 crc kubenswrapper[4956]: I0930 05:50:22.050014 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=34.049994286 podStartE2EDuration="34.049994286s" podCreationTimestamp="2025-09-30 05:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 05:50:22.046539607 +0000 UTC m=+1292.373660142" watchObservedRunningTime="2025-09-30 05:50:22.049994286 +0000 UTC m=+1292.377114821"
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.954273 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"]
Sep 30 05:50:35 crc kubenswrapper[4956]: E0930 05:50:35.955060 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" containerName="init"
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.955098 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" containerName="init"
Sep 30 05:50:35 crc kubenswrapper[4956]: E0930 05:50:35.955183 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d5c3da-4ce1-4408-b18d-6dae65f47b36" containerName="init"
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.955193 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d5c3da-4ce1-4408-b18d-6dae65f47b36" containerName="init"
Sep 30 05:50:35 crc kubenswrapper[4956]: E0930 05:50:35.955202 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d5c3da-4ce1-4408-b18d-6dae65f47b36" containerName="dnsmasq-dns"
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.955208 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d5c3da-4ce1-4408-b18d-6dae65f47b36" containerName="dnsmasq-dns"
Sep 30 05:50:35 crc kubenswrapper[4956]: E0930 05:50:35.955231 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" containerName="dnsmasq-dns"
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.955236 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" containerName="dnsmasq-dns"
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.955409 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="09aa9a8a-09d9-4d6d-bc52-cdd26403a0c7" containerName="dnsmasq-dns"
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.955420 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1d5c3da-4ce1-4408-b18d-6dae65f47b36" containerName="dnsmasq-dns"
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.956319 4956 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.962690 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.962861 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.963088 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn"
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.963466 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"]
Sep 30 05:50:35 crc kubenswrapper[4956]: I0930 05:50:35.963759 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.043655 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fc627\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.043751 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fc627\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.043861 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fc627\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.043883 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2cpw\" (UniqueName: \"kubernetes.io/projected/2b60bb30-87ca-43de-a737-1a0fc105197e-kube-api-access-j2cpw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fc627\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.145605 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2cpw\" (UniqueName: \"kubernetes.io/projected/2b60bb30-87ca-43de-a737-1a0fc105197e-kube-api-access-j2cpw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fc627\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.145716 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fc627\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.145751 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fc627\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.145848 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fc627\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.152968 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fc627\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.153370 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fc627\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.157709 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fc627\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.163528 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2cpw\" (UniqueName: \"kubernetes.io/projected/2b60bb30-87ca-43de-a737-1a0fc105197e-kube-api-access-j2cpw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fc627\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:36 crc kubenswrapper[4956]: I0930 05:50:36.274006 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:50:37 crc kubenswrapper[4956]: I0930 05:50:37.069073 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"]
Sep 30 05:50:37 crc kubenswrapper[4956]: I0930 05:50:37.181751 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627" event={"ID":"2b60bb30-87ca-43de-a737-1a0fc105197e","Type":"ContainerStarted","Data":"75a65defe585eef1476698a31a62c319f191eb341582260c1f32efdecf696e5b"}
Sep 30 05:50:37 crc kubenswrapper[4956]: I0930 05:50:37.990430 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Sep 30 05:50:38 crc kubenswrapper[4956]: I0930 05:50:38.997467 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 05:50:46 crc kubenswrapper[4956]: I0930 05:50:46.271001 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627" event={"ID":"2b60bb30-87ca-43de-a737-1a0fc105197e","Type":"ContainerStarted","Data":"3a7076208cb817ffaa726100c5815d898e2dbf0af78f0b216673783b252e8c9e"}
Sep 30 05:50:46 crc kubenswrapper[4956]: I0930 05:50:46.294398 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
podStartSLOduration=2.434267919 podStartE2EDuration="11.294378007s" podCreationTimestamp="2025-09-30 05:50:35 +0000 UTC" firstStartedPulling="2025-09-30 05:50:37.078478913 +0000 UTC m=+1307.405599448" lastFinishedPulling="2025-09-30 05:50:45.938589011 +0000 UTC m=+1316.265709536" observedRunningTime="2025-09-30 05:50:46.288151951 +0000 UTC m=+1316.615272516" watchObservedRunningTime="2025-09-30 05:50:46.294378007 +0000 UTC m=+1316.621498532"
Sep 30 05:50:58 crc kubenswrapper[4956]: I0930 05:50:58.434551 4956 generic.go:334] "Generic (PLEG): container finished" podID="2b60bb30-87ca-43de-a737-1a0fc105197e" containerID="3a7076208cb817ffaa726100c5815d898e2dbf0af78f0b216673783b252e8c9e" exitCode=0
Sep 30 05:50:58 crc kubenswrapper[4956]: I0930 05:50:58.434633 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627" event={"ID":"2b60bb30-87ca-43de-a737-1a0fc105197e","Type":"ContainerDied","Data":"3a7076208cb817ffaa726100c5815d898e2dbf0af78f0b216673783b252e8c9e"}
Sep 30 05:50:59 crc kubenswrapper[4956]: I0930 05:50:59.936656 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.038790 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-repo-setup-combined-ca-bundle\") pod \"2b60bb30-87ca-43de-a737-1a0fc105197e\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") "
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.039261 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2cpw\" (UniqueName: \"kubernetes.io/projected/2b60bb30-87ca-43de-a737-1a0fc105197e-kube-api-access-j2cpw\") pod \"2b60bb30-87ca-43de-a737-1a0fc105197e\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") "
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.039459 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-ssh-key\") pod \"2b60bb30-87ca-43de-a737-1a0fc105197e\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") "
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.039636 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-inventory\") pod \"2b60bb30-87ca-43de-a737-1a0fc105197e\" (UID: \"2b60bb30-87ca-43de-a737-1a0fc105197e\") "
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.045438 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2b60bb30-87ca-43de-a737-1a0fc105197e" (UID: "2b60bb30-87ca-43de-a737-1a0fc105197e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.046282 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b60bb30-87ca-43de-a737-1a0fc105197e-kube-api-access-j2cpw" (OuterVolumeSpecName: "kube-api-access-j2cpw") pod "2b60bb30-87ca-43de-a737-1a0fc105197e" (UID: "2b60bb30-87ca-43de-a737-1a0fc105197e"). InnerVolumeSpecName "kube-api-access-j2cpw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.067594 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-inventory" (OuterVolumeSpecName: "inventory") pod "2b60bb30-87ca-43de-a737-1a0fc105197e" (UID: "2b60bb30-87ca-43de-a737-1a0fc105197e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.073304 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2b60bb30-87ca-43de-a737-1a0fc105197e" (UID: "2b60bb30-87ca-43de-a737-1a0fc105197e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.142523 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.142579 4956 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.142597 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2cpw\" (UniqueName: \"kubernetes.io/projected/2b60bb30-87ca-43de-a737-1a0fc105197e-kube-api-access-j2cpw\") on node \"crc\" DevicePath \"\""
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.142612 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b60bb30-87ca-43de-a737-1a0fc105197e-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.457870 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627" event={"ID":"2b60bb30-87ca-43de-a737-1a0fc105197e","Type":"ContainerDied","Data":"75a65defe585eef1476698a31a62c319f191eb341582260c1f32efdecf696e5b"}
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.457937 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fc627"
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.457948 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75a65defe585eef1476698a31a62c319f191eb341582260c1f32efdecf696e5b"
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.552429 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt"]
Sep 30 05:51:00 crc kubenswrapper[4956]: E0930 05:51:00.552943 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b60bb30-87ca-43de-a737-1a0fc105197e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.552964 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b60bb30-87ca-43de-a737-1a0fc105197e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.553189 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b60bb30-87ca-43de-a737-1a0fc105197e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.553904 4956 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.556410 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.557036 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.561640 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.561742 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.564239 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt"] Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.652920 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70901491-8063-4429-baee-c1a295960e2c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6c6wt\" (UID: \"70901491-8063-4429-baee-c1a295960e2c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.652996 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70901491-8063-4429-baee-c1a295960e2c-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6c6wt\" (UID: \"70901491-8063-4429-baee-c1a295960e2c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.653076 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8prbs\" (UniqueName: \"kubernetes.io/projected/70901491-8063-4429-baee-c1a295960e2c-kube-api-access-8prbs\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6c6wt\" (UID: \"70901491-8063-4429-baee-c1a295960e2c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.766905 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70901491-8063-4429-baee-c1a295960e2c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6c6wt\" (UID: \"70901491-8063-4429-baee-c1a295960e2c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.767014 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70901491-8063-4429-baee-c1a295960e2c-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6c6wt\" (UID: \"70901491-8063-4429-baee-c1a295960e2c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.767150 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8prbs\" (UniqueName: \"kubernetes.io/projected/70901491-8063-4429-baee-c1a295960e2c-kube-api-access-8prbs\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6c6wt\" (UID: \"70901491-8063-4429-baee-c1a295960e2c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.772036 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70901491-8063-4429-baee-c1a295960e2c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6c6wt\" (UID: \"70901491-8063-4429-baee-c1a295960e2c\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.773366 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70901491-8063-4429-baee-c1a295960e2c-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6c6wt\" (UID: \"70901491-8063-4429-baee-c1a295960e2c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.784367 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8prbs\" (UniqueName: \"kubernetes.io/projected/70901491-8063-4429-baee-c1a295960e2c-kube-api-access-8prbs\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-6c6wt\" (UID: \"70901491-8063-4429-baee-c1a295960e2c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:00 crc kubenswrapper[4956]: I0930 05:51:00.884603 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:01 crc kubenswrapper[4956]: I0930 05:51:01.415035 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt"] Sep 30 05:51:01 crc kubenswrapper[4956]: W0930 05:51:01.416519 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70901491_8063_4429_baee_c1a295960e2c.slice/crio-350af098ba76d6aca9c7c3861a7abd060fe7aab84084bb168531628e4cadbff4 WatchSource:0}: Error finding container 350af098ba76d6aca9c7c3861a7abd060fe7aab84084bb168531628e4cadbff4: Status 404 returned error can't find the container with id 350af098ba76d6aca9c7c3861a7abd060fe7aab84084bb168531628e4cadbff4 Sep 30 05:51:01 crc kubenswrapper[4956]: I0930 05:51:01.469397 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" event={"ID":"70901491-8063-4429-baee-c1a295960e2c","Type":"ContainerStarted","Data":"350af098ba76d6aca9c7c3861a7abd060fe7aab84084bb168531628e4cadbff4"} Sep 30 05:51:02 crc kubenswrapper[4956]: I0930 05:51:02.478464 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" event={"ID":"70901491-8063-4429-baee-c1a295960e2c","Type":"ContainerStarted","Data":"e54786f47c7202c6bc35b8b8eacf01435d22eb177f06fa6750ca585b3c8eb727"} Sep 30 05:51:02 crc kubenswrapper[4956]: I0930 05:51:02.498854 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" podStartSLOduration=1.809263455 podStartE2EDuration="2.498837987s" podCreationTimestamp="2025-09-30 05:51:00 +0000 UTC" firstStartedPulling="2025-09-30 05:51:01.419517464 +0000 UTC m=+1331.746637989" lastFinishedPulling="2025-09-30 05:51:02.109091986 +0000 UTC m=+1332.436212521" observedRunningTime="2025-09-30 
05:51:02.493571631 +0000 UTC m=+1332.820692156" watchObservedRunningTime="2025-09-30 05:51:02.498837987 +0000 UTC m=+1332.825958512" Sep 30 05:51:05 crc kubenswrapper[4956]: I0930 05:51:05.514946 4956 generic.go:334] "Generic (PLEG): container finished" podID="70901491-8063-4429-baee-c1a295960e2c" containerID="e54786f47c7202c6bc35b8b8eacf01435d22eb177f06fa6750ca585b3c8eb727" exitCode=0 Sep 30 05:51:05 crc kubenswrapper[4956]: I0930 05:51:05.515025 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" event={"ID":"70901491-8063-4429-baee-c1a295960e2c","Type":"ContainerDied","Data":"e54786f47c7202c6bc35b8b8eacf01435d22eb177f06fa6750ca585b3c8eb727"} Sep 30 05:51:06 crc kubenswrapper[4956]: I0930 05:51:06.950552 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.088489 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8prbs\" (UniqueName: \"kubernetes.io/projected/70901491-8063-4429-baee-c1a295960e2c-kube-api-access-8prbs\") pod \"70901491-8063-4429-baee-c1a295960e2c\" (UID: \"70901491-8063-4429-baee-c1a295960e2c\") " Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.089051 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70901491-8063-4429-baee-c1a295960e2c-inventory\") pod \"70901491-8063-4429-baee-c1a295960e2c\" (UID: \"70901491-8063-4429-baee-c1a295960e2c\") " Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.089227 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70901491-8063-4429-baee-c1a295960e2c-ssh-key\") pod \"70901491-8063-4429-baee-c1a295960e2c\" (UID: \"70901491-8063-4429-baee-c1a295960e2c\") " Sep 30 05:51:07 crc 
kubenswrapper[4956]: I0930 05:51:07.094647 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70901491-8063-4429-baee-c1a295960e2c-kube-api-access-8prbs" (OuterVolumeSpecName: "kube-api-access-8prbs") pod "70901491-8063-4429-baee-c1a295960e2c" (UID: "70901491-8063-4429-baee-c1a295960e2c"). InnerVolumeSpecName "kube-api-access-8prbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.115780 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70901491-8063-4429-baee-c1a295960e2c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "70901491-8063-4429-baee-c1a295960e2c" (UID: "70901491-8063-4429-baee-c1a295960e2c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.132958 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70901491-8063-4429-baee-c1a295960e2c-inventory" (OuterVolumeSpecName: "inventory") pod "70901491-8063-4429-baee-c1a295960e2c" (UID: "70901491-8063-4429-baee-c1a295960e2c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.192297 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70901491-8063-4429-baee-c1a295960e2c-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.192342 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70901491-8063-4429-baee-c1a295960e2c-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.192351 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8prbs\" (UniqueName: \"kubernetes.io/projected/70901491-8063-4429-baee-c1a295960e2c-kube-api-access-8prbs\") on node \"crc\" DevicePath \"\"" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.540035 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" event={"ID":"70901491-8063-4429-baee-c1a295960e2c","Type":"ContainerDied","Data":"350af098ba76d6aca9c7c3861a7abd060fe7aab84084bb168531628e4cadbff4"} Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.540079 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="350af098ba76d6aca9c7c3861a7abd060fe7aab84084bb168531628e4cadbff4" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.540275 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-6c6wt" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.618095 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv"] Sep 30 05:51:07 crc kubenswrapper[4956]: E0930 05:51:07.618488 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70901491-8063-4429-baee-c1a295960e2c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.618504 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="70901491-8063-4429-baee-c1a295960e2c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.618713 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="70901491-8063-4429-baee-c1a295960e2c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.627030 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.628783 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.628820 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv"] Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.629916 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.630050 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.632996 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.701609 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.701673 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjq9d\" (UniqueName: \"kubernetes.io/projected/dc55cabb-5e01-4012-a02f-27ee023df0c4-kube-api-access-tjq9d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.701714 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.701789 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.803945 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.804106 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjq9d\" (UniqueName: \"kubernetes.io/projected/dc55cabb-5e01-4012-a02f-27ee023df0c4-kube-api-access-tjq9d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.804262 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.804471 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.808712 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.808805 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.809158 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.825161 4956 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-tjq9d\" (UniqueName: \"kubernetes.io/projected/dc55cabb-5e01-4012-a02f-27ee023df0c4-kube-api-access-tjq9d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:07 crc kubenswrapper[4956]: I0930 05:51:07.953590 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:51:08 crc kubenswrapper[4956]: I0930 05:51:08.509881 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv"] Sep 30 05:51:08 crc kubenswrapper[4956]: I0930 05:51:08.552325 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" event={"ID":"dc55cabb-5e01-4012-a02f-27ee023df0c4","Type":"ContainerStarted","Data":"1171892d6cdade130ee95716b973b98cb62c369c43910e9e6327c158b14bba93"} Sep 30 05:51:09 crc kubenswrapper[4956]: I0930 05:51:09.567657 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" event={"ID":"dc55cabb-5e01-4012-a02f-27ee023df0c4","Type":"ContainerStarted","Data":"dc7460c16841460f04a65917e3cf6ea1be5549b2a5dc62091de9678cfa3042a8"} Sep 30 05:51:09 crc kubenswrapper[4956]: I0930 05:51:09.607323 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" podStartSLOduration=2.138934117 podStartE2EDuration="2.607302653s" podCreationTimestamp="2025-09-30 05:51:07 +0000 UTC" firstStartedPulling="2025-09-30 05:51:08.507821778 +0000 UTC m=+1338.834942303" lastFinishedPulling="2025-09-30 05:51:08.976190314 +0000 UTC m=+1339.303310839" observedRunningTime="2025-09-30 05:51:09.594819841 +0000 UTC m=+1339.921940396" watchObservedRunningTime="2025-09-30 
05:51:09.607302653 +0000 UTC m=+1339.934423178" Sep 30 05:51:11 crc kubenswrapper[4956]: I0930 05:51:11.384453 4956 scope.go:117] "RemoveContainer" containerID="509e3037099b2a742bd50bc91ebd9ee374dec1310ab499be8d8ba222635ab076" Sep 30 05:51:11 crc kubenswrapper[4956]: I0930 05:51:11.415974 4956 scope.go:117] "RemoveContainer" containerID="dcfcd45f7e03ea0e088ebb12d6de6cdba3c6ceeb13717333758e38d327cab22a" Sep 30 05:51:11 crc kubenswrapper[4956]: I0930 05:51:11.465915 4956 scope.go:117] "RemoveContainer" containerID="02a32a5345e4055e251f2625d2879b5c43b7bad65df838c3cf1eb9a3f53a1c82" Sep 30 05:51:11 crc kubenswrapper[4956]: I0930 05:51:11.509109 4956 scope.go:117] "RemoveContainer" containerID="8ee40a03262790de3fbb7f3871a285f616d4b453cac08f4902675e2c34bc2937" Sep 30 05:52:11 crc kubenswrapper[4956]: I0930 05:52:11.615234 4956 scope.go:117] "RemoveContainer" containerID="64c657c67da3d57af3ab43ba2cca01b58dbfb7c3a32ad06f048fafe65e2880f9" Sep 30 05:52:11 crc kubenswrapper[4956]: I0930 05:52:11.652598 4956 scope.go:117] "RemoveContainer" containerID="f290fda2699c3420e1a65192c4d5f290468f1f8fa05af961d2f7c175441173e3" Sep 30 05:52:11 crc kubenswrapper[4956]: I0930 05:52:11.677310 4956 scope.go:117] "RemoveContainer" containerID="3e1f243e230666e6cbb921173fd66b7c96619ede87f4641aa5e56ac9001fd8c7" Sep 30 05:52:11 crc kubenswrapper[4956]: I0930 05:52:11.696670 4956 scope.go:117] "RemoveContainer" containerID="b68193af02d30c7f87ae81cf906482510126044a33388b576fe65bfad61a28c3" Sep 30 05:52:11 crc kubenswrapper[4956]: I0930 05:52:11.720187 4956 scope.go:117] "RemoveContainer" containerID="04b84c0b3e87464976e5c29c292c9d76380685b63d170678c4fb2aefa5553328" Sep 30 05:52:18 crc kubenswrapper[4956]: I0930 05:52:18.073248 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Sep 30 05:52:18 crc kubenswrapper[4956]: I0930 05:52:18.074417 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:52:29 crc kubenswrapper[4956]: I0930 05:52:29.692518 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wcxpr"] Sep 30 05:52:29 crc kubenswrapper[4956]: I0930 05:52:29.696930 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:29 crc kubenswrapper[4956]: I0930 05:52:29.713002 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wcxpr"] Sep 30 05:52:29 crc kubenswrapper[4956]: I0930 05:52:29.754284 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-catalog-content\") pod \"community-operators-wcxpr\" (UID: \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\") " pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:29 crc kubenswrapper[4956]: I0930 05:52:29.754492 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8sbm\" (UniqueName: \"kubernetes.io/projected/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-kube-api-access-v8sbm\") pod \"community-operators-wcxpr\" (UID: \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\") " pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:29 crc kubenswrapper[4956]: I0930 05:52:29.754592 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-utilities\") pod \"community-operators-wcxpr\" (UID: \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\") " pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:29 crc kubenswrapper[4956]: I0930 05:52:29.856917 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8sbm\" (UniqueName: \"kubernetes.io/projected/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-kube-api-access-v8sbm\") pod \"community-operators-wcxpr\" (UID: \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\") " pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:29 crc kubenswrapper[4956]: I0930 05:52:29.857488 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-utilities\") pod \"community-operators-wcxpr\" (UID: \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\") " pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:29 crc kubenswrapper[4956]: I0930 05:52:29.857594 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-catalog-content\") pod \"community-operators-wcxpr\" (UID: \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\") " pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:29 crc kubenswrapper[4956]: I0930 05:52:29.858075 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-utilities\") pod \"community-operators-wcxpr\" (UID: \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\") " pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:29 crc kubenswrapper[4956]: I0930 05:52:29.858134 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-catalog-content\") pod \"community-operators-wcxpr\" (UID: \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\") " pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:29 crc kubenswrapper[4956]: I0930 05:52:29.894084 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8sbm\" (UniqueName: \"kubernetes.io/projected/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-kube-api-access-v8sbm\") pod \"community-operators-wcxpr\" (UID: \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\") " pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:30 crc kubenswrapper[4956]: I0930 05:52:30.030733 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:30 crc kubenswrapper[4956]: I0930 05:52:30.541380 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wcxpr"] Sep 30 05:52:31 crc kubenswrapper[4956]: I0930 05:52:31.410199 4956 generic.go:334] "Generic (PLEG): container finished" podID="c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" containerID="78c36f97ec6ee8fc03b8886a0fb02292dac36beb2e4e758d27a7ce740090404a" exitCode=0 Sep 30 05:52:31 crc kubenswrapper[4956]: I0930 05:52:31.410653 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcxpr" event={"ID":"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7","Type":"ContainerDied","Data":"78c36f97ec6ee8fc03b8886a0fb02292dac36beb2e4e758d27a7ce740090404a"} Sep 30 05:52:31 crc kubenswrapper[4956]: I0930 05:52:31.410698 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcxpr" event={"ID":"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7","Type":"ContainerStarted","Data":"375b4afccb9d7d7e2250dd62bb4a7863f2ff3ce4be43e8f9d7647ffd6cd2db1f"} Sep 30 05:52:32 crc kubenswrapper[4956]: I0930 05:52:32.422385 4956 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-wcxpr" event={"ID":"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7","Type":"ContainerStarted","Data":"5133d90e3a54131f050f8ad35118df5e77ac9979acc0a4a66c07e6f1aa25518d"} Sep 30 05:52:34 crc kubenswrapper[4956]: I0930 05:52:34.447645 4956 generic.go:334] "Generic (PLEG): container finished" podID="c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" containerID="5133d90e3a54131f050f8ad35118df5e77ac9979acc0a4a66c07e6f1aa25518d" exitCode=0 Sep 30 05:52:34 crc kubenswrapper[4956]: I0930 05:52:34.447674 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcxpr" event={"ID":"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7","Type":"ContainerDied","Data":"5133d90e3a54131f050f8ad35118df5e77ac9979acc0a4a66c07e6f1aa25518d"} Sep 30 05:52:35 crc kubenswrapper[4956]: I0930 05:52:35.462517 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcxpr" event={"ID":"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7","Type":"ContainerStarted","Data":"a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b"} Sep 30 05:52:35 crc kubenswrapper[4956]: I0930 05:52:35.486293 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wcxpr" podStartSLOduration=3.03624324 podStartE2EDuration="6.486273692s" podCreationTimestamp="2025-09-30 05:52:29 +0000 UTC" firstStartedPulling="2025-09-30 05:52:31.413094799 +0000 UTC m=+1421.740215364" lastFinishedPulling="2025-09-30 05:52:34.863125291 +0000 UTC m=+1425.190245816" observedRunningTime="2025-09-30 05:52:35.479903963 +0000 UTC m=+1425.807024508" watchObservedRunningTime="2025-09-30 05:52:35.486273692 +0000 UTC m=+1425.813394217" Sep 30 05:52:40 crc kubenswrapper[4956]: I0930 05:52:40.031899 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:40 crc kubenswrapper[4956]: I0930 
05:52:40.032344 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:40 crc kubenswrapper[4956]: I0930 05:52:40.102610 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:40 crc kubenswrapper[4956]: I0930 05:52:40.571655 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:40 crc kubenswrapper[4956]: I0930 05:52:40.626471 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wcxpr"] Sep 30 05:52:42 crc kubenswrapper[4956]: I0930 05:52:42.529733 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wcxpr" podUID="c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" containerName="registry-server" containerID="cri-o://a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b" gracePeriod=2 Sep 30 05:52:42 crc kubenswrapper[4956]: I0930 05:52:42.979822 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.118820 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8sbm\" (UniqueName: \"kubernetes.io/projected/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-kube-api-access-v8sbm\") pod \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\" (UID: \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\") " Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.118975 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-catalog-content\") pod \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\" (UID: \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\") " Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.119101 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-utilities\") pod \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\" (UID: \"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7\") " Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.120573 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-utilities" (OuterVolumeSpecName: "utilities") pod "c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" (UID: "c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.128954 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-kube-api-access-v8sbm" (OuterVolumeSpecName: "kube-api-access-v8sbm") pod "c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" (UID: "c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7"). InnerVolumeSpecName "kube-api-access-v8sbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.172348 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" (UID: "c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.221982 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8sbm\" (UniqueName: \"kubernetes.io/projected/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-kube-api-access-v8sbm\") on node \"crc\" DevicePath \"\"" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.222048 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.222062 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.546480 4956 generic.go:334] "Generic (PLEG): container finished" podID="c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" containerID="a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b" exitCode=0 Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.546533 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcxpr" event={"ID":"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7","Type":"ContainerDied","Data":"a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b"} Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.546591 4956 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-wcxpr" event={"ID":"c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7","Type":"ContainerDied","Data":"375b4afccb9d7d7e2250dd62bb4a7863f2ff3ce4be43e8f9d7647ffd6cd2db1f"} Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.546616 4956 scope.go:117] "RemoveContainer" containerID="a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.546642 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcxpr" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.570384 4956 scope.go:117] "RemoveContainer" containerID="5133d90e3a54131f050f8ad35118df5e77ac9979acc0a4a66c07e6f1aa25518d" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.597234 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wcxpr"] Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.598979 4956 scope.go:117] "RemoveContainer" containerID="78c36f97ec6ee8fc03b8886a0fb02292dac36beb2e4e758d27a7ce740090404a" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.607877 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wcxpr"] Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.652811 4956 scope.go:117] "RemoveContainer" containerID="a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b" Sep 30 05:52:43 crc kubenswrapper[4956]: E0930 05:52:43.653215 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b\": container with ID starting with a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b not found: ID does not exist" containerID="a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 
05:52:43.653253 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b"} err="failed to get container status \"a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b\": rpc error: code = NotFound desc = could not find container \"a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b\": container with ID starting with a4fb0f72e698e3ac57bb52c6fbd1923c722c97a50eda90677a87709d6010686b not found: ID does not exist" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.653277 4956 scope.go:117] "RemoveContainer" containerID="5133d90e3a54131f050f8ad35118df5e77ac9979acc0a4a66c07e6f1aa25518d" Sep 30 05:52:43 crc kubenswrapper[4956]: E0930 05:52:43.653568 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5133d90e3a54131f050f8ad35118df5e77ac9979acc0a4a66c07e6f1aa25518d\": container with ID starting with 5133d90e3a54131f050f8ad35118df5e77ac9979acc0a4a66c07e6f1aa25518d not found: ID does not exist" containerID="5133d90e3a54131f050f8ad35118df5e77ac9979acc0a4a66c07e6f1aa25518d" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.653600 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5133d90e3a54131f050f8ad35118df5e77ac9979acc0a4a66c07e6f1aa25518d"} err="failed to get container status \"5133d90e3a54131f050f8ad35118df5e77ac9979acc0a4a66c07e6f1aa25518d\": rpc error: code = NotFound desc = could not find container \"5133d90e3a54131f050f8ad35118df5e77ac9979acc0a4a66c07e6f1aa25518d\": container with ID starting with 5133d90e3a54131f050f8ad35118df5e77ac9979acc0a4a66c07e6f1aa25518d not found: ID does not exist" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.653621 4956 scope.go:117] "RemoveContainer" containerID="78c36f97ec6ee8fc03b8886a0fb02292dac36beb2e4e758d27a7ce740090404a" Sep 30 05:52:43 crc 
kubenswrapper[4956]: E0930 05:52:43.653817 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78c36f97ec6ee8fc03b8886a0fb02292dac36beb2e4e758d27a7ce740090404a\": container with ID starting with 78c36f97ec6ee8fc03b8886a0fb02292dac36beb2e4e758d27a7ce740090404a not found: ID does not exist" containerID="78c36f97ec6ee8fc03b8886a0fb02292dac36beb2e4e758d27a7ce740090404a" Sep 30 05:52:43 crc kubenswrapper[4956]: I0930 05:52:43.653842 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78c36f97ec6ee8fc03b8886a0fb02292dac36beb2e4e758d27a7ce740090404a"} err="failed to get container status \"78c36f97ec6ee8fc03b8886a0fb02292dac36beb2e4e758d27a7ce740090404a\": rpc error: code = NotFound desc = could not find container \"78c36f97ec6ee8fc03b8886a0fb02292dac36beb2e4e758d27a7ce740090404a\": container with ID starting with 78c36f97ec6ee8fc03b8886a0fb02292dac36beb2e4e758d27a7ce740090404a not found: ID does not exist" Sep 30 05:52:44 crc kubenswrapper[4956]: I0930 05:52:44.355214 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" path="/var/lib/kubelet/pods/c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7/volumes" Sep 30 05:52:48 crc kubenswrapper[4956]: I0930 05:52:48.073506 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:52:48 crc kubenswrapper[4956]: I0930 05:52:48.074947 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.743109 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q2sdb"] Sep 30 05:53:06 crc kubenswrapper[4956]: E0930 05:53:06.744292 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" containerName="extract-utilities" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.744312 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" containerName="extract-utilities" Sep 30 05:53:06 crc kubenswrapper[4956]: E0930 05:53:06.744337 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" containerName="extract-content" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.744345 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" containerName="extract-content" Sep 30 05:53:06 crc kubenswrapper[4956]: E0930 05:53:06.744367 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" containerName="registry-server" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.744375 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" containerName="registry-server" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.744637 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ce374e-9c2f-4eb5-a30e-d1e286cb04d7" containerName="registry-server" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.747716 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.759852 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q2sdb"] Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.850935 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdnfh\" (UniqueName: \"kubernetes.io/projected/e435ba47-7ec5-4c48-80f4-5638536ac786-kube-api-access-pdnfh\") pod \"certified-operators-q2sdb\" (UID: \"e435ba47-7ec5-4c48-80f4-5638536ac786\") " pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.851131 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e435ba47-7ec5-4c48-80f4-5638536ac786-utilities\") pod \"certified-operators-q2sdb\" (UID: \"e435ba47-7ec5-4c48-80f4-5638536ac786\") " pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.851394 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e435ba47-7ec5-4c48-80f4-5638536ac786-catalog-content\") pod \"certified-operators-q2sdb\" (UID: \"e435ba47-7ec5-4c48-80f4-5638536ac786\") " pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.953040 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e435ba47-7ec5-4c48-80f4-5638536ac786-catalog-content\") pod \"certified-operators-q2sdb\" (UID: \"e435ba47-7ec5-4c48-80f4-5638536ac786\") " pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.953088 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pdnfh\" (UniqueName: \"kubernetes.io/projected/e435ba47-7ec5-4c48-80f4-5638536ac786-kube-api-access-pdnfh\") pod \"certified-operators-q2sdb\" (UID: \"e435ba47-7ec5-4c48-80f4-5638536ac786\") " pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.953160 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e435ba47-7ec5-4c48-80f4-5638536ac786-utilities\") pod \"certified-operators-q2sdb\" (UID: \"e435ba47-7ec5-4c48-80f4-5638536ac786\") " pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.953635 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e435ba47-7ec5-4c48-80f4-5638536ac786-utilities\") pod \"certified-operators-q2sdb\" (UID: \"e435ba47-7ec5-4c48-80f4-5638536ac786\") " pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.953694 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e435ba47-7ec5-4c48-80f4-5638536ac786-catalog-content\") pod \"certified-operators-q2sdb\" (UID: \"e435ba47-7ec5-4c48-80f4-5638536ac786\") " pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:06 crc kubenswrapper[4956]: I0930 05:53:06.977957 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdnfh\" (UniqueName: \"kubernetes.io/projected/e435ba47-7ec5-4c48-80f4-5638536ac786-kube-api-access-pdnfh\") pod \"certified-operators-q2sdb\" (UID: \"e435ba47-7ec5-4c48-80f4-5638536ac786\") " pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:07 crc kubenswrapper[4956]: I0930 05:53:07.074063 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:07 crc kubenswrapper[4956]: I0930 05:53:07.606232 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q2sdb"] Sep 30 05:53:07 crc kubenswrapper[4956]: I0930 05:53:07.793310 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2sdb" event={"ID":"e435ba47-7ec5-4c48-80f4-5638536ac786","Type":"ContainerStarted","Data":"df9ad9d4a8f7f8cd8b5fcd4e613c1e1b8f8969d95cd1bec1513f5022a106806d"} Sep 30 05:53:08 crc kubenswrapper[4956]: I0930 05:53:08.808508 4956 generic.go:334] "Generic (PLEG): container finished" podID="e435ba47-7ec5-4c48-80f4-5638536ac786" containerID="d6bb9e53989748beb5b5d096ef9524ce2c90ca803df0fa11db328287366a7402" exitCode=0 Sep 30 05:53:08 crc kubenswrapper[4956]: I0930 05:53:08.808564 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2sdb" event={"ID":"e435ba47-7ec5-4c48-80f4-5638536ac786","Type":"ContainerDied","Data":"d6bb9e53989748beb5b5d096ef9524ce2c90ca803df0fa11db328287366a7402"} Sep 30 05:53:09 crc kubenswrapper[4956]: I0930 05:53:09.820382 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2sdb" event={"ID":"e435ba47-7ec5-4c48-80f4-5638536ac786","Type":"ContainerStarted","Data":"60ff70b1edacdda1f54f17fb7b2ca0562ff94721421ac755f36ad0b8f06e4e85"} Sep 30 05:53:10 crc kubenswrapper[4956]: I0930 05:53:10.836148 4956 generic.go:334] "Generic (PLEG): container finished" podID="e435ba47-7ec5-4c48-80f4-5638536ac786" containerID="60ff70b1edacdda1f54f17fb7b2ca0562ff94721421ac755f36ad0b8f06e4e85" exitCode=0 Sep 30 05:53:10 crc kubenswrapper[4956]: I0930 05:53:10.836194 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2sdb" 
event={"ID":"e435ba47-7ec5-4c48-80f4-5638536ac786","Type":"ContainerDied","Data":"60ff70b1edacdda1f54f17fb7b2ca0562ff94721421ac755f36ad0b8f06e4e85"} Sep 30 05:53:11 crc kubenswrapper[4956]: I0930 05:53:11.848155 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2sdb" event={"ID":"e435ba47-7ec5-4c48-80f4-5638536ac786","Type":"ContainerStarted","Data":"7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304"} Sep 30 05:53:11 crc kubenswrapper[4956]: I0930 05:53:11.881156 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q2sdb" podStartSLOduration=3.433977609 podStartE2EDuration="5.881137546s" podCreationTimestamp="2025-09-30 05:53:06 +0000 UTC" firstStartedPulling="2025-09-30 05:53:08.81089283 +0000 UTC m=+1459.138013365" lastFinishedPulling="2025-09-30 05:53:11.258052777 +0000 UTC m=+1461.585173302" observedRunningTime="2025-09-30 05:53:11.8754851 +0000 UTC m=+1462.202605645" watchObservedRunningTime="2025-09-30 05:53:11.881137546 +0000 UTC m=+1462.208258101" Sep 30 05:53:17 crc kubenswrapper[4956]: I0930 05:53:17.074528 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:17 crc kubenswrapper[4956]: I0930 05:53:17.075206 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:17 crc kubenswrapper[4956]: I0930 05:53:17.133866 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:17 crc kubenswrapper[4956]: I0930 05:53:17.959088 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:18 crc kubenswrapper[4956]: I0930 05:53:18.015734 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-q2sdb"] Sep 30 05:53:18 crc kubenswrapper[4956]: I0930 05:53:18.075213 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 05:53:18 crc kubenswrapper[4956]: I0930 05:53:18.075291 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 05:53:18 crc kubenswrapper[4956]: I0930 05:53:18.075345 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 05:53:18 crc kubenswrapper[4956]: I0930 05:53:18.076426 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 05:53:18 crc kubenswrapper[4956]: I0930 05:53:18.076534 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" gracePeriod=600 Sep 30 05:53:18 crc kubenswrapper[4956]: E0930 05:53:18.200500 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:53:18 crc kubenswrapper[4956]: I0930 05:53:18.923704 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" exitCode=0 Sep 30 05:53:18 crc kubenswrapper[4956]: I0930 05:53:18.923769 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b"} Sep 30 05:53:18 crc kubenswrapper[4956]: I0930 05:53:18.924380 4956 scope.go:117] "RemoveContainer" containerID="d6707b9d26600f66963acd8d20bb882ca13db20f72f914fce4d118ce50f76933" Sep 30 05:53:18 crc kubenswrapper[4956]: I0930 05:53:18.925066 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:53:18 crc kubenswrapper[4956]: E0930 05:53:18.925387 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:53:19 crc kubenswrapper[4956]: I0930 05:53:19.936224 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q2sdb" 
podUID="e435ba47-7ec5-4c48-80f4-5638536ac786" containerName="registry-server" containerID="cri-o://7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304" gracePeriod=2 Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.364151 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.525063 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdnfh\" (UniqueName: \"kubernetes.io/projected/e435ba47-7ec5-4c48-80f4-5638536ac786-kube-api-access-pdnfh\") pod \"e435ba47-7ec5-4c48-80f4-5638536ac786\" (UID: \"e435ba47-7ec5-4c48-80f4-5638536ac786\") " Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.525276 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e435ba47-7ec5-4c48-80f4-5638536ac786-utilities\") pod \"e435ba47-7ec5-4c48-80f4-5638536ac786\" (UID: \"e435ba47-7ec5-4c48-80f4-5638536ac786\") " Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.525350 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e435ba47-7ec5-4c48-80f4-5638536ac786-catalog-content\") pod \"e435ba47-7ec5-4c48-80f4-5638536ac786\" (UID: \"e435ba47-7ec5-4c48-80f4-5638536ac786\") " Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.526278 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e435ba47-7ec5-4c48-80f4-5638536ac786-utilities" (OuterVolumeSpecName: "utilities") pod "e435ba47-7ec5-4c48-80f4-5638536ac786" (UID: "e435ba47-7ec5-4c48-80f4-5638536ac786"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.532331 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e435ba47-7ec5-4c48-80f4-5638536ac786-kube-api-access-pdnfh" (OuterVolumeSpecName: "kube-api-access-pdnfh") pod "e435ba47-7ec5-4c48-80f4-5638536ac786" (UID: "e435ba47-7ec5-4c48-80f4-5638536ac786"). InnerVolumeSpecName "kube-api-access-pdnfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.591221 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e435ba47-7ec5-4c48-80f4-5638536ac786-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e435ba47-7ec5-4c48-80f4-5638536ac786" (UID: "e435ba47-7ec5-4c48-80f4-5638536ac786"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.628375 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e435ba47-7ec5-4c48-80f4-5638536ac786-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.628412 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e435ba47-7ec5-4c48-80f4-5638536ac786-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.628430 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdnfh\" (UniqueName: \"kubernetes.io/projected/e435ba47-7ec5-4c48-80f4-5638536ac786-kube-api-access-pdnfh\") on node \"crc\" DevicePath \"\"" Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.945997 4956 generic.go:334] "Generic (PLEG): container finished" podID="e435ba47-7ec5-4c48-80f4-5638536ac786" 
containerID="7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304" exitCode=0 Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.946050 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2sdb" event={"ID":"e435ba47-7ec5-4c48-80f4-5638536ac786","Type":"ContainerDied","Data":"7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304"} Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.946366 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q2sdb" event={"ID":"e435ba47-7ec5-4c48-80f4-5638536ac786","Type":"ContainerDied","Data":"df9ad9d4a8f7f8cd8b5fcd4e613c1e1b8f8969d95cd1bec1513f5022a106806d"} Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.946391 4956 scope.go:117] "RemoveContainer" containerID="7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304" Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.946142 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q2sdb" Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.974748 4956 scope.go:117] "RemoveContainer" containerID="60ff70b1edacdda1f54f17fb7b2ca0562ff94721421ac755f36ad0b8f06e4e85" Sep 30 05:53:20 crc kubenswrapper[4956]: I0930 05:53:20.991796 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q2sdb"] Sep 30 05:53:21 crc kubenswrapper[4956]: I0930 05:53:21.000996 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q2sdb"] Sep 30 05:53:21 crc kubenswrapper[4956]: I0930 05:53:21.006728 4956 scope.go:117] "RemoveContainer" containerID="d6bb9e53989748beb5b5d096ef9524ce2c90ca803df0fa11db328287366a7402" Sep 30 05:53:21 crc kubenswrapper[4956]: I0930 05:53:21.062334 4956 scope.go:117] "RemoveContainer" containerID="7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304" Sep 30 05:53:21 crc kubenswrapper[4956]: E0930 05:53:21.062823 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304\": container with ID starting with 7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304 not found: ID does not exist" containerID="7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304" Sep 30 05:53:21 crc kubenswrapper[4956]: I0930 05:53:21.062863 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304"} err="failed to get container status \"7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304\": rpc error: code = NotFound desc = could not find container \"7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304\": container with ID starting with 7da37bcd20fa139efcc6e3564f162d02d5ccd53cb62af18eca54bd74ac27a304 not 
found: ID does not exist" Sep 30 05:53:21 crc kubenswrapper[4956]: I0930 05:53:21.062891 4956 scope.go:117] "RemoveContainer" containerID="60ff70b1edacdda1f54f17fb7b2ca0562ff94721421ac755f36ad0b8f06e4e85" Sep 30 05:53:21 crc kubenswrapper[4956]: E0930 05:53:21.063161 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60ff70b1edacdda1f54f17fb7b2ca0562ff94721421ac755f36ad0b8f06e4e85\": container with ID starting with 60ff70b1edacdda1f54f17fb7b2ca0562ff94721421ac755f36ad0b8f06e4e85 not found: ID does not exist" containerID="60ff70b1edacdda1f54f17fb7b2ca0562ff94721421ac755f36ad0b8f06e4e85" Sep 30 05:53:21 crc kubenswrapper[4956]: I0930 05:53:21.063193 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60ff70b1edacdda1f54f17fb7b2ca0562ff94721421ac755f36ad0b8f06e4e85"} err="failed to get container status \"60ff70b1edacdda1f54f17fb7b2ca0562ff94721421ac755f36ad0b8f06e4e85\": rpc error: code = NotFound desc = could not find container \"60ff70b1edacdda1f54f17fb7b2ca0562ff94721421ac755f36ad0b8f06e4e85\": container with ID starting with 60ff70b1edacdda1f54f17fb7b2ca0562ff94721421ac755f36ad0b8f06e4e85 not found: ID does not exist" Sep 30 05:53:21 crc kubenswrapper[4956]: I0930 05:53:21.063213 4956 scope.go:117] "RemoveContainer" containerID="d6bb9e53989748beb5b5d096ef9524ce2c90ca803df0fa11db328287366a7402" Sep 30 05:53:21 crc kubenswrapper[4956]: E0930 05:53:21.063519 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6bb9e53989748beb5b5d096ef9524ce2c90ca803df0fa11db328287366a7402\": container with ID starting with d6bb9e53989748beb5b5d096ef9524ce2c90ca803df0fa11db328287366a7402 not found: ID does not exist" containerID="d6bb9e53989748beb5b5d096ef9524ce2c90ca803df0fa11db328287366a7402" Sep 30 05:53:21 crc kubenswrapper[4956]: I0930 05:53:21.063549 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bb9e53989748beb5b5d096ef9524ce2c90ca803df0fa11db328287366a7402"} err="failed to get container status \"d6bb9e53989748beb5b5d096ef9524ce2c90ca803df0fa11db328287366a7402\": rpc error: code = NotFound desc = could not find container \"d6bb9e53989748beb5b5d096ef9524ce2c90ca803df0fa11db328287366a7402\": container with ID starting with d6bb9e53989748beb5b5d096ef9524ce2c90ca803df0fa11db328287366a7402 not found: ID does not exist" Sep 30 05:53:22 crc kubenswrapper[4956]: I0930 05:53:22.361972 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e435ba47-7ec5-4c48-80f4-5638536ac786" path="/var/lib/kubelet/pods/e435ba47-7ec5-4c48-80f4-5638536ac786/volumes" Sep 30 05:53:29 crc kubenswrapper[4956]: I0930 05:53:29.340797 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:53:29 crc kubenswrapper[4956]: E0930 05:53:29.341646 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:53:40 crc kubenswrapper[4956]: I0930 05:53:40.349597 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:53:40 crc kubenswrapper[4956]: E0930 05:53:40.350389 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:53:53 crc kubenswrapper[4956]: I0930 05:53:53.341324 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:53:53 crc kubenswrapper[4956]: E0930 05:53:53.342128 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:54:04 crc kubenswrapper[4956]: I0930 05:54:04.341444 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:54:04 crc kubenswrapper[4956]: E0930 05:54:04.342334 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:54:18 crc kubenswrapper[4956]: I0930 05:54:18.341228 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:54:18 crc kubenswrapper[4956]: E0930 05:54:18.343658 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:54:20 crc kubenswrapper[4956]: I0930 05:54:20.038867 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-qm5wt"] Sep 30 05:54:20 crc kubenswrapper[4956]: I0930 05:54:20.049897 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-qm5wt"] Sep 30 05:54:20 crc kubenswrapper[4956]: I0930 05:54:20.350849 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a30fe5e-a789-4251-87bb-bba57356337a" path="/var/lib/kubelet/pods/5a30fe5e-a789-4251-87bb-bba57356337a/volumes" Sep 30 05:54:21 crc kubenswrapper[4956]: I0930 05:54:21.537765 4956 generic.go:334] "Generic (PLEG): container finished" podID="dc55cabb-5e01-4012-a02f-27ee023df0c4" containerID="dc7460c16841460f04a65917e3cf6ea1be5549b2a5dc62091de9678cfa3042a8" exitCode=0 Sep 30 05:54:21 crc kubenswrapper[4956]: I0930 05:54:21.537902 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" event={"ID":"dc55cabb-5e01-4012-a02f-27ee023df0c4","Type":"ContainerDied","Data":"dc7460c16841460f04a65917e3cf6ea1be5549b2a5dc62091de9678cfa3042a8"} Sep 30 05:54:22 crc kubenswrapper[4956]: I0930 05:54:22.946737 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.047061 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-inventory\") pod \"dc55cabb-5e01-4012-a02f-27ee023df0c4\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.047236 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjq9d\" (UniqueName: \"kubernetes.io/projected/dc55cabb-5e01-4012-a02f-27ee023df0c4-kube-api-access-tjq9d\") pod \"dc55cabb-5e01-4012-a02f-27ee023df0c4\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.047373 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-ssh-key\") pod \"dc55cabb-5e01-4012-a02f-27ee023df0c4\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.047400 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-bootstrap-combined-ca-bundle\") pod \"dc55cabb-5e01-4012-a02f-27ee023df0c4\" (UID: \"dc55cabb-5e01-4012-a02f-27ee023df0c4\") " Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.053780 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc55cabb-5e01-4012-a02f-27ee023df0c4-kube-api-access-tjq9d" (OuterVolumeSpecName: "kube-api-access-tjq9d") pod "dc55cabb-5e01-4012-a02f-27ee023df0c4" (UID: "dc55cabb-5e01-4012-a02f-27ee023df0c4"). InnerVolumeSpecName "kube-api-access-tjq9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.053878 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "dc55cabb-5e01-4012-a02f-27ee023df0c4" (UID: "dc55cabb-5e01-4012-a02f-27ee023df0c4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.079916 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-inventory" (OuterVolumeSpecName: "inventory") pod "dc55cabb-5e01-4012-a02f-27ee023df0c4" (UID: "dc55cabb-5e01-4012-a02f-27ee023df0c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.080469 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dc55cabb-5e01-4012-a02f-27ee023df0c4" (UID: "dc55cabb-5e01-4012-a02f-27ee023df0c4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.149745 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.149791 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjq9d\" (UniqueName: \"kubernetes.io/projected/dc55cabb-5e01-4012-a02f-27ee023df0c4-kube-api-access-tjq9d\") on node \"crc\" DevicePath \"\"" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.149807 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.149817 4956 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55cabb-5e01-4012-a02f-27ee023df0c4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.594780 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" event={"ID":"dc55cabb-5e01-4012-a02f-27ee023df0c4","Type":"ContainerDied","Data":"1171892d6cdade130ee95716b973b98cb62c369c43910e9e6327c158b14bba93"} Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.595057 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1171892d6cdade130ee95716b973b98cb62c369c43910e9e6327c158b14bba93" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.594915 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.663721 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6"] Sep 30 05:54:23 crc kubenswrapper[4956]: E0930 05:54:23.664192 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e435ba47-7ec5-4c48-80f4-5638536ac786" containerName="extract-content" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.664208 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e435ba47-7ec5-4c48-80f4-5638536ac786" containerName="extract-content" Sep 30 05:54:23 crc kubenswrapper[4956]: E0930 05:54:23.664219 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e435ba47-7ec5-4c48-80f4-5638536ac786" containerName="registry-server" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.664226 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e435ba47-7ec5-4c48-80f4-5638536ac786" containerName="registry-server" Sep 30 05:54:23 crc kubenswrapper[4956]: E0930 05:54:23.664242 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc55cabb-5e01-4012-a02f-27ee023df0c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.664249 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc55cabb-5e01-4012-a02f-27ee023df0c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 05:54:23 crc kubenswrapper[4956]: E0930 05:54:23.664279 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e435ba47-7ec5-4c48-80f4-5638536ac786" containerName="extract-utilities" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.664285 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e435ba47-7ec5-4c48-80f4-5638536ac786" containerName="extract-utilities" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.664474 
4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e435ba47-7ec5-4c48-80f4-5638536ac786" containerName="registry-server" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.664497 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc55cabb-5e01-4012-a02f-27ee023df0c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.665217 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.667460 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.667745 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.667993 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.668203 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.678897 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6"] Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.786088 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6\" (UID: \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 
05:54:23.786247 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l495d\" (UniqueName: \"kubernetes.io/projected/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-kube-api-access-l495d\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6\" (UID: \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.786723 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6\" (UID: \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.889338 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6\" (UID: \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.889397 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l495d\" (UniqueName: \"kubernetes.io/projected/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-kube-api-access-l495d\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6\" (UID: \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.889510 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6\" (UID: \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.894609 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6\" (UID: \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.903013 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6\" (UID: \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.912660 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l495d\" (UniqueName: \"kubernetes.io/projected/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-kube-api-access-l495d\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6\" (UID: \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:54:23 crc kubenswrapper[4956]: I0930 05:54:23.994083 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:54:24 crc kubenswrapper[4956]: I0930 05:54:24.523695 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6"] Sep 30 05:54:24 crc kubenswrapper[4956]: I0930 05:54:24.529172 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 05:54:24 crc kubenswrapper[4956]: I0930 05:54:24.604577 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" event={"ID":"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa","Type":"ContainerStarted","Data":"bc21360b6ee527e510445b334b0b6fc35ae8cd511fb9a0f39947921a0bca9fe6"} Sep 30 05:54:25 crc kubenswrapper[4956]: I0930 05:54:25.615149 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" event={"ID":"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa","Type":"ContainerStarted","Data":"1ca885a1092ae3c194cb169f13ff5c1f131fed742fbc7fef4d5594d30a82f859"} Sep 30 05:54:25 crc kubenswrapper[4956]: I0930 05:54:25.635726 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" podStartSLOduration=2.156860448 podStartE2EDuration="2.635699592s" podCreationTimestamp="2025-09-30 05:54:23 +0000 UTC" firstStartedPulling="2025-09-30 05:54:24.528930243 +0000 UTC m=+1534.856050768" lastFinishedPulling="2025-09-30 05:54:25.007769377 +0000 UTC m=+1535.334889912" observedRunningTime="2025-09-30 05:54:25.627270847 +0000 UTC m=+1535.954391372" watchObservedRunningTime="2025-09-30 05:54:25.635699592 +0000 UTC m=+1535.962820157" Sep 30 05:54:28 crc kubenswrapper[4956]: I0930 05:54:28.032629 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-pspqn"] Sep 30 05:54:28 crc kubenswrapper[4956]: 
I0930 05:54:28.046827 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-pjvh9"] Sep 30 05:54:28 crc kubenswrapper[4956]: I0930 05:54:28.055714 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-z6kts"] Sep 30 05:54:28 crc kubenswrapper[4956]: I0930 05:54:28.064256 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-pjvh9"] Sep 30 05:54:28 crc kubenswrapper[4956]: I0930 05:54:28.072556 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-pspqn"] Sep 30 05:54:28 crc kubenswrapper[4956]: I0930 05:54:28.085422 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-z6kts"] Sep 30 05:54:28 crc kubenswrapper[4956]: I0930 05:54:28.357788 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="417d8763-c01b-4a78-aabe-76229bf38f79" path="/var/lib/kubelet/pods/417d8763-c01b-4a78-aabe-76229bf38f79/volumes" Sep 30 05:54:28 crc kubenswrapper[4956]: I0930 05:54:28.358969 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47c03935-f3af-4925-8dcc-d4b6c6906cf0" path="/var/lib/kubelet/pods/47c03935-f3af-4925-8dcc-d4b6c6906cf0/volumes" Sep 30 05:54:28 crc kubenswrapper[4956]: I0930 05:54:28.359475 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d89a6444-ef65-4aeb-ba5f-cce1b00ae461" path="/var/lib/kubelet/pods/d89a6444-ef65-4aeb-ba5f-cce1b00ae461/volumes" Sep 30 05:54:29 crc kubenswrapper[4956]: I0930 05:54:29.341149 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:54:29 crc kubenswrapper[4956]: E0930 05:54:29.341492 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:54:30 crc kubenswrapper[4956]: I0930 05:54:30.050420 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-ac03-account-create-dzz6s"] Sep 30 05:54:30 crc kubenswrapper[4956]: I0930 05:54:30.068675 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-ac03-account-create-dzz6s"] Sep 30 05:54:30 crc kubenswrapper[4956]: I0930 05:54:30.353037 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3549a7-c08c-4c47-a587-9a9672ef54f7" path="/var/lib/kubelet/pods/5d3549a7-c08c-4c47-a587-9a9672ef54f7/volumes" Sep 30 05:54:38 crc kubenswrapper[4956]: I0930 05:54:38.053407 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5dab-account-create-crhmx"] Sep 30 05:54:38 crc kubenswrapper[4956]: I0930 05:54:38.069086 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a5a5-account-create-4v8x2"] Sep 30 05:54:38 crc kubenswrapper[4956]: I0930 05:54:38.082882 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f878-account-create-cs4z9"] Sep 30 05:54:38 crc kubenswrapper[4956]: I0930 05:54:38.098793 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a5a5-account-create-4v8x2"] Sep 30 05:54:38 crc kubenswrapper[4956]: I0930 05:54:38.107732 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5dab-account-create-crhmx"] Sep 30 05:54:38 crc kubenswrapper[4956]: I0930 05:54:38.116736 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f878-account-create-cs4z9"] Sep 30 05:54:38 crc kubenswrapper[4956]: I0930 05:54:38.353775 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d1987b5-707c-4084-a639-b626742fb1a3" path="/var/lib/kubelet/pods/1d1987b5-707c-4084-a639-b626742fb1a3/volumes" Sep 30 05:54:38 crc kubenswrapper[4956]: I0930 05:54:38.355254 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4234d793-0813-4224-8b30-754326f3a4e2" path="/var/lib/kubelet/pods/4234d793-0813-4224-8b30-754326f3a4e2/volumes" Sep 30 05:54:38 crc kubenswrapper[4956]: I0930 05:54:38.356491 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b78cea6c-ba5c-4c6f-ace8-b815445520ed" path="/var/lib/kubelet/pods/b78cea6c-ba5c-4c6f-ace8-b815445520ed/volumes" Sep 30 05:54:42 crc kubenswrapper[4956]: I0930 05:54:42.341203 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:54:42 crc kubenswrapper[4956]: E0930 05:54:42.341750 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:54:56 crc kubenswrapper[4956]: I0930 05:54:56.342232 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:54:56 crc kubenswrapper[4956]: E0930 05:54:56.344373 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:55:05 crc 
kubenswrapper[4956]: I0930 05:55:05.057919 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bk28v"] Sep 30 05:55:05 crc kubenswrapper[4956]: I0930 05:55:05.066207 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bk28v"] Sep 30 05:55:06 crc kubenswrapper[4956]: I0930 05:55:06.361071 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b14fc082-107c-44e4-b2c2-314d9b8010c7" path="/var/lib/kubelet/pods/b14fc082-107c-44e4-b2c2-314d9b8010c7/volumes" Sep 30 05:55:07 crc kubenswrapper[4956]: I0930 05:55:07.027701 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-c75th"] Sep 30 05:55:07 crc kubenswrapper[4956]: I0930 05:55:07.039855 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-ks9m6"] Sep 30 05:55:07 crc kubenswrapper[4956]: I0930 05:55:07.047403 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-c75th"] Sep 30 05:55:07 crc kubenswrapper[4956]: I0930 05:55:07.057385 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-ks9m6"] Sep 30 05:55:08 crc kubenswrapper[4956]: I0930 05:55:08.341650 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:55:08 crc kubenswrapper[4956]: E0930 05:55:08.342354 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:55:08 crc kubenswrapper[4956]: I0930 05:55:08.360709 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dc066a4f-bea1-47c4-96c4-f3dcda1a930d" path="/var/lib/kubelet/pods/dc066a4f-bea1-47c4-96c4-f3dcda1a930d/volumes" Sep 30 05:55:08 crc kubenswrapper[4956]: I0930 05:55:08.362405 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f51c959c-848b-46f3-a128-1604dd5fc435" path="/var/lib/kubelet/pods/f51c959c-848b-46f3-a128-1604dd5fc435/volumes" Sep 30 05:55:10 crc kubenswrapper[4956]: I0930 05:55:10.040240 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-6sh8d"] Sep 30 05:55:10 crc kubenswrapper[4956]: I0930 05:55:10.052825 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-6sh8d"] Sep 30 05:55:10 crc kubenswrapper[4956]: I0930 05:55:10.353852 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ff0cad1-ad65-4838-b4d7-b43974fbe477" path="/var/lib/kubelet/pods/0ff0cad1-ad65-4838-b4d7-b43974fbe477/volumes" Sep 30 05:55:11 crc kubenswrapper[4956]: I0930 05:55:11.901460 4956 scope.go:117] "RemoveContainer" containerID="a9fea60df5a048553fc067fbaf2628453ccdd923dd95567a529f794d75430890" Sep 30 05:55:11 crc kubenswrapper[4956]: I0930 05:55:11.935318 4956 scope.go:117] "RemoveContainer" containerID="aa9296ad7dbe9f24f36a580c8f290af620b406ededd5aa936b8787092bf8da70" Sep 30 05:55:11 crc kubenswrapper[4956]: I0930 05:55:11.978669 4956 scope.go:117] "RemoveContainer" containerID="2301fbabd5b9cc312adc9c880f268328ff09ef0b14a3c14a205a007ec8b3a3b7" Sep 30 05:55:12 crc kubenswrapper[4956]: I0930 05:55:12.028635 4956 scope.go:117] "RemoveContainer" containerID="5e82cb4d8be83def87bd14d177e7a5d18f58cb4811029d7ffe387a8978e1070b" Sep 30 05:55:12 crc kubenswrapper[4956]: I0930 05:55:12.079197 4956 scope.go:117] "RemoveContainer" containerID="5eddfba933b9bac2e24149a63bae92b7951a443550ba8aaf8b3374dd4853caf1" Sep 30 05:55:12 crc kubenswrapper[4956]: I0930 05:55:12.118532 4956 scope.go:117] "RemoveContainer" 
containerID="6db3d0166ba7a8f7de8a258c636eecb483ebfa0b0bdfd0727ca2b4772507c5c2" Sep 30 05:55:12 crc kubenswrapper[4956]: I0930 05:55:12.164951 4956 scope.go:117] "RemoveContainer" containerID="b3507a67deaa2910720a6fbbca070c38ed9a2ec79e7100f4aa6a081510852367" Sep 30 05:55:12 crc kubenswrapper[4956]: I0930 05:55:12.190301 4956 scope.go:117] "RemoveContainer" containerID="0caea1a43ed4362bbcff0cb4946f5fe971ff1378fb1c48fd54c8129c5feabb23" Sep 30 05:55:12 crc kubenswrapper[4956]: I0930 05:55:12.212858 4956 scope.go:117] "RemoveContainer" containerID="28fc84e7e1f8da81e6de8ac31305b2590c17a3be728b641677bdbea2830e17fd" Sep 30 05:55:12 crc kubenswrapper[4956]: I0930 05:55:12.244697 4956 scope.go:117] "RemoveContainer" containerID="e35bfb8b1e3edbe9c7812f2407e9c810d0b32bcb5dd01a7a4d3fabb9d064bf18" Sep 30 05:55:12 crc kubenswrapper[4956]: I0930 05:55:12.265630 4956 scope.go:117] "RemoveContainer" containerID="5f6f01c705c6103866b94d4b909e757392136708c6914d11166f55f72f036861" Sep 30 05:55:12 crc kubenswrapper[4956]: I0930 05:55:12.288775 4956 scope.go:117] "RemoveContainer" containerID="2057be081c11543283d04c30ea09d2959e6322cd9857be78d2086c6c1800af53" Sep 30 05:55:16 crc kubenswrapper[4956]: I0930 05:55:16.032417 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-tkh7q"] Sep 30 05:55:16 crc kubenswrapper[4956]: I0930 05:55:16.048856 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-tkh7q"] Sep 30 05:55:16 crc kubenswrapper[4956]: I0930 05:55:16.353102 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="826ffd10-134e-452b-93de-3a9bc469d44d" path="/var/lib/kubelet/pods/826ffd10-134e-452b-93de-3a9bc469d44d/volumes" Sep 30 05:55:18 crc kubenswrapper[4956]: I0930 05:55:18.028043 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-588e-account-create-9zfn6"] Sep 30 05:55:18 crc kubenswrapper[4956]: I0930 05:55:18.038796 4956 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/barbican-588e-account-create-9zfn6"] Sep 30 05:55:18 crc kubenswrapper[4956]: I0930 05:55:18.364829 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cfef647-732a-43ff-ad7d-59b6f1eb1f2a" path="/var/lib/kubelet/pods/5cfef647-732a-43ff-ad7d-59b6f1eb1f2a/volumes" Sep 30 05:55:19 crc kubenswrapper[4956]: I0930 05:55:19.080821 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-24d8t"] Sep 30 05:55:19 crc kubenswrapper[4956]: I0930 05:55:19.087964 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-24d8t"] Sep 30 05:55:19 crc kubenswrapper[4956]: I0930 05:55:19.342064 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:55:19 crc kubenswrapper[4956]: E0930 05:55:19.342424 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:55:20 crc kubenswrapper[4956]: I0930 05:55:20.358204 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de9dd57e-7978-4d30-b9ec-a3894e9d67e9" path="/var/lib/kubelet/pods/de9dd57e-7978-4d30-b9ec-a3894e9d67e9/volumes" Sep 30 05:55:30 crc kubenswrapper[4956]: I0930 05:55:30.349682 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:55:30 crc kubenswrapper[4956]: E0930 05:55:30.350995 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:55:31 crc kubenswrapper[4956]: I0930 05:55:31.033070 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a921-account-create-q4xck"] Sep 30 05:55:31 crc kubenswrapper[4956]: I0930 05:55:31.052888 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3e76-account-create-gl5vp"] Sep 30 05:55:31 crc kubenswrapper[4956]: I0930 05:55:31.067351 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a921-account-create-q4xck"] Sep 30 05:55:31 crc kubenswrapper[4956]: I0930 05:55:31.078822 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3e76-account-create-gl5vp"] Sep 30 05:55:32 crc kubenswrapper[4956]: I0930 05:55:32.353893 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6488fac1-2ea4-4c60-bb3c-8014ec1e8149" path="/var/lib/kubelet/pods/6488fac1-2ea4-4c60-bb3c-8014ec1e8149/volumes" Sep 30 05:55:32 crc kubenswrapper[4956]: I0930 05:55:32.354553 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fbaf681-c7d0-4217-be5c-1a15e5e36786" path="/var/lib/kubelet/pods/8fbaf681-c7d0-4217-be5c-1a15e5e36786/volumes" Sep 30 05:55:45 crc kubenswrapper[4956]: I0930 05:55:45.341835 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:55:45 crc kubenswrapper[4956]: E0930 05:55:45.344630 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:55:50 crc kubenswrapper[4956]: I0930 05:55:50.052357 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-t7llp"] Sep 30 05:55:50 crc kubenswrapper[4956]: I0930 05:55:50.061875 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-t7llp"] Sep 30 05:55:50 crc kubenswrapper[4956]: I0930 05:55:50.356580 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1b07e53-1ac0-4958-8ce3-b43ed35fbdef" path="/var/lib/kubelet/pods/c1b07e53-1ac0-4958-8ce3-b43ed35fbdef/volumes" Sep 30 05:55:51 crc kubenswrapper[4956]: I0930 05:55:51.028249 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-dqs9m"] Sep 30 05:55:51 crc kubenswrapper[4956]: I0930 05:55:51.035627 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-dqs9m"] Sep 30 05:55:52 crc kubenswrapper[4956]: I0930 05:55:52.380095 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6c94959-3c58-4a09-b7ee-bb13a3f82fb5" path="/var/lib/kubelet/pods/a6c94959-3c58-4a09-b7ee-bb13a3f82fb5/volumes" Sep 30 05:55:57 crc kubenswrapper[4956]: I0930 05:55:57.341449 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:55:57 crc kubenswrapper[4956]: E0930 05:55:57.342277 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:56:08 crc kubenswrapper[4956]: I0930 05:56:08.040367 4956 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-cd8q4"] Sep 30 05:56:08 crc kubenswrapper[4956]: I0930 05:56:08.052648 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-cd8q4"] Sep 30 05:56:08 crc kubenswrapper[4956]: I0930 05:56:08.351331 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d947526f-907a-4951-bbc4-51e29c560a06" path="/var/lib/kubelet/pods/d947526f-907a-4951-bbc4-51e29c560a06/volumes" Sep 30 05:56:11 crc kubenswrapper[4956]: I0930 05:56:11.052484 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-hvzfv"] Sep 30 05:56:11 crc kubenswrapper[4956]: I0930 05:56:11.059518 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-hvzfv"] Sep 30 05:56:12 crc kubenswrapper[4956]: I0930 05:56:12.341852 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:56:12 crc kubenswrapper[4956]: E0930 05:56:12.342622 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:56:12 crc kubenswrapper[4956]: I0930 05:56:12.355664 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d95941-d95f-4302-84f1-9230a7b9001f" path="/var/lib/kubelet/pods/d7d95941-d95f-4302-84f1-9230a7b9001f/volumes" Sep 30 05:56:12 crc kubenswrapper[4956]: I0930 05:56:12.585507 4956 scope.go:117] "RemoveContainer" containerID="de0db8d3f713cb626f0e9868ee39ba9e2ca3744a472c2f70318349ecca3f4c47" Sep 30 05:56:12 crc kubenswrapper[4956]: I0930 05:56:12.627730 4956 scope.go:117] 
"RemoveContainer" containerID="ec2d7b018b094efac24191b05f52c1eed8ac1b7d18057ad473bffafb88b0a3ca" Sep 30 05:56:12 crc kubenswrapper[4956]: I0930 05:56:12.687368 4956 scope.go:117] "RemoveContainer" containerID="02008c1544ef92e2caf961213b92835f8fd219bfbdf2cceed525a7d78c6ac0cc" Sep 30 05:56:12 crc kubenswrapper[4956]: I0930 05:56:12.742004 4956 scope.go:117] "RemoveContainer" containerID="a19ec9f1292a11a446422a9d1f7a9f37771e9154f9e3722a35d8537ac0a6a2d1" Sep 30 05:56:12 crc kubenswrapper[4956]: I0930 05:56:12.776345 4956 scope.go:117] "RemoveContainer" containerID="60c9efbf05be74f8df8e8f8e01f3abcc4fc69208616b8113d59a7436b44ac4da" Sep 30 05:56:12 crc kubenswrapper[4956]: I0930 05:56:12.831172 4956 scope.go:117] "RemoveContainer" containerID="60d50e8397fd199fc2a8700196daf630ff9b71e0e7bc2f21f6d16fcfc823b648" Sep 30 05:56:12 crc kubenswrapper[4956]: I0930 05:56:12.887835 4956 scope.go:117] "RemoveContainer" containerID="81798e0d8072235339857c798c8b6e1b6fafc5fd4d1819afa49e16da10d22f6c" Sep 30 05:56:12 crc kubenswrapper[4956]: I0930 05:56:12.915614 4956 scope.go:117] "RemoveContainer" containerID="50fdd61af367195757809d530dd915594a5c82ae555ee93f1467646b253bcd38" Sep 30 05:56:12 crc kubenswrapper[4956]: I0930 05:56:12.935903 4956 scope.go:117] "RemoveContainer" containerID="18af1d264db565f9694f2705f2a0d5db22f030f2f86aff76c63caaa9f8edecde" Sep 30 05:56:24 crc kubenswrapper[4956]: I0930 05:56:24.340917 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:56:24 crc kubenswrapper[4956]: E0930 05:56:24.341948 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" 
podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:56:25 crc kubenswrapper[4956]: I0930 05:56:25.878853 4956 generic.go:334] "Generic (PLEG): container finished" podID="6f2a0586-6940-498b-8eaa-1f16bf0ea2fa" containerID="1ca885a1092ae3c194cb169f13ff5c1f131fed742fbc7fef4d5594d30a82f859" exitCode=0 Sep 30 05:56:25 crc kubenswrapper[4956]: I0930 05:56:25.878971 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" event={"ID":"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa","Type":"ContainerDied","Data":"1ca885a1092ae3c194cb169f13ff5c1f131fed742fbc7fef4d5594d30a82f859"} Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.322810 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.518446 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-ssh-key\") pod \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\" (UID: \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\") " Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.518595 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-inventory\") pod \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\" (UID: \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\") " Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.518626 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l495d\" (UniqueName: \"kubernetes.io/projected/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-kube-api-access-l495d\") pod \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\" (UID: \"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa\") " Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.524317 4956 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-kube-api-access-l495d" (OuterVolumeSpecName: "kube-api-access-l495d") pod "6f2a0586-6940-498b-8eaa-1f16bf0ea2fa" (UID: "6f2a0586-6940-498b-8eaa-1f16bf0ea2fa"). InnerVolumeSpecName "kube-api-access-l495d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.544829 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-inventory" (OuterVolumeSpecName: "inventory") pod "6f2a0586-6940-498b-8eaa-1f16bf0ea2fa" (UID: "6f2a0586-6940-498b-8eaa-1f16bf0ea2fa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.545822 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6f2a0586-6940-498b-8eaa-1f16bf0ea2fa" (UID: "6f2a0586-6940-498b-8eaa-1f16bf0ea2fa"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.621018 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.621049 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.621061 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l495d\" (UniqueName: \"kubernetes.io/projected/6f2a0586-6940-498b-8eaa-1f16bf0ea2fa-kube-api-access-l495d\") on node \"crc\" DevicePath \"\"" Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.906649 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" event={"ID":"6f2a0586-6940-498b-8eaa-1f16bf0ea2fa","Type":"ContainerDied","Data":"bc21360b6ee527e510445b334b0b6fc35ae8cd511fb9a0f39947921a0bca9fe6"} Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.906729 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc21360b6ee527e510445b334b0b6fc35ae8cd511fb9a0f39947921a0bca9fe6" Sep 30 05:56:27 crc kubenswrapper[4956]: I0930 05:56:27.906890 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.003224 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm"] Sep 30 05:56:28 crc kubenswrapper[4956]: E0930 05:56:28.003700 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f2a0586-6940-498b-8eaa-1f16bf0ea2fa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.003717 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2a0586-6940-498b-8eaa-1f16bf0ea2fa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.003956 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f2a0586-6940-498b-8eaa-1f16bf0ea2fa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.004655 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.006692 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.007186 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.007315 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.007318 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.023657 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm"] Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.045632 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxlrq\" (UniqueName: \"kubernetes.io/projected/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-kube-api-access-kxlrq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm\" (UID: \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.045698 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm\" (UID: \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 
05:56:28.045745 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm\" (UID: \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.148026 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm\" (UID: \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.148225 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxlrq\" (UniqueName: \"kubernetes.io/projected/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-kube-api-access-kxlrq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm\" (UID: \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.148253 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm\" (UID: \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.151470 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm\" (UID: \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.159666 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm\" (UID: \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.173674 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxlrq\" (UniqueName: \"kubernetes.io/projected/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-kube-api-access-kxlrq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm\" (UID: \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.321309 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.823491 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm"] Sep 30 05:56:28 crc kubenswrapper[4956]: I0930 05:56:28.916193 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" event={"ID":"7577cb07-1bcb-4432-a6e8-f57c1f1b2421","Type":"ContainerStarted","Data":"d5c3875b0bedb5d411970a5d9e0e358ec1e94e37f4286c75ddc536cd7a826938"} Sep 30 05:56:29 crc kubenswrapper[4956]: I0930 05:56:29.925688 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" event={"ID":"7577cb07-1bcb-4432-a6e8-f57c1f1b2421","Type":"ContainerStarted","Data":"d2779da15f3f1233557de64d53a8837b74a137453471d97e33ae3d64c9cfaeaf"} Sep 30 05:56:29 crc kubenswrapper[4956]: I0930 05:56:29.941733 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" podStartSLOduration=2.481467635 podStartE2EDuration="2.941717811s" podCreationTimestamp="2025-09-30 05:56:27 +0000 UTC" firstStartedPulling="2025-09-30 05:56:28.829549985 +0000 UTC m=+1659.156670510" lastFinishedPulling="2025-09-30 05:56:29.289800171 +0000 UTC m=+1659.616920686" observedRunningTime="2025-09-30 05:56:29.941234045 +0000 UTC m=+1660.268354590" watchObservedRunningTime="2025-09-30 05:56:29.941717811 +0000 UTC m=+1660.268838336" Sep 30 05:56:38 crc kubenswrapper[4956]: I0930 05:56:38.341072 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:56:38 crc kubenswrapper[4956]: E0930 05:56:38.341745 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:56:39 crc kubenswrapper[4956]: I0930 05:56:39.052683 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-nnr6k"] Sep 30 05:56:39 crc kubenswrapper[4956]: I0930 05:56:39.066109 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-nnr6k"] Sep 30 05:56:40 crc kubenswrapper[4956]: I0930 05:56:40.353547 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740de0e5-c3e9-43bc-bb01-8c240f50070e" path="/var/lib/kubelet/pods/740de0e5-c3e9-43bc-bb01-8c240f50070e/volumes" Sep 30 05:56:42 crc kubenswrapper[4956]: I0930 05:56:42.796873 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wgnws"] Sep 30 05:56:42 crc kubenswrapper[4956]: I0930 05:56:42.808533 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgnws"] Sep 30 05:56:42 crc kubenswrapper[4956]: I0930 05:56:42.808621 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:42 crc kubenswrapper[4956]: I0930 05:56:42.939577 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-catalog-content\") pod \"redhat-marketplace-wgnws\" (UID: \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\") " pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:42 crc kubenswrapper[4956]: I0930 05:56:42.939936 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-utilities\") pod \"redhat-marketplace-wgnws\" (UID: \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\") " pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:42 crc kubenswrapper[4956]: I0930 05:56:42.940020 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l25l\" (UniqueName: \"kubernetes.io/projected/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-kube-api-access-2l25l\") pod \"redhat-marketplace-wgnws\" (UID: \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\") " pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:43 crc kubenswrapper[4956]: I0930 05:56:43.041933 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-catalog-content\") pod \"redhat-marketplace-wgnws\" (UID: \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\") " pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:43 crc kubenswrapper[4956]: I0930 05:56:43.042057 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-utilities\") pod \"redhat-marketplace-wgnws\" 
(UID: \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\") " pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:43 crc kubenswrapper[4956]: I0930 05:56:43.042156 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l25l\" (UniqueName: \"kubernetes.io/projected/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-kube-api-access-2l25l\") pod \"redhat-marketplace-wgnws\" (UID: \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\") " pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:43 crc kubenswrapper[4956]: I0930 05:56:43.042569 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-catalog-content\") pod \"redhat-marketplace-wgnws\" (UID: \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\") " pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:43 crc kubenswrapper[4956]: I0930 05:56:43.042815 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-utilities\") pod \"redhat-marketplace-wgnws\" (UID: \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\") " pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:43 crc kubenswrapper[4956]: I0930 05:56:43.068182 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l25l\" (UniqueName: \"kubernetes.io/projected/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-kube-api-access-2l25l\") pod \"redhat-marketplace-wgnws\" (UID: \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\") " pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:43 crc kubenswrapper[4956]: I0930 05:56:43.172756 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:43 crc kubenswrapper[4956]: I0930 05:56:43.656725 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgnws"] Sep 30 05:56:43 crc kubenswrapper[4956]: W0930 05:56:43.664928 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0e64ece_c9db_4c3d_ac9e_a9e300f76a5c.slice/crio-3ce9df436491b40250d94fa499f7209ad7805ae955797243a84803cca16026de WatchSource:0}: Error finding container 3ce9df436491b40250d94fa499f7209ad7805ae955797243a84803cca16026de: Status 404 returned error can't find the container with id 3ce9df436491b40250d94fa499f7209ad7805ae955797243a84803cca16026de Sep 30 05:56:44 crc kubenswrapper[4956]: I0930 05:56:44.065148 4956 generic.go:334] "Generic (PLEG): container finished" podID="d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" containerID="66b29cd15389f4326b00cd1551c357c3dfb31395f93f43bc202a126d6a76bb61" exitCode=0 Sep 30 05:56:44 crc kubenswrapper[4956]: I0930 05:56:44.065202 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgnws" event={"ID":"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c","Type":"ContainerDied","Data":"66b29cd15389f4326b00cd1551c357c3dfb31395f93f43bc202a126d6a76bb61"} Sep 30 05:56:44 crc kubenswrapper[4956]: I0930 05:56:44.065236 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgnws" event={"ID":"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c","Type":"ContainerStarted","Data":"3ce9df436491b40250d94fa499f7209ad7805ae955797243a84803cca16026de"} Sep 30 05:56:45 crc kubenswrapper[4956]: I0930 05:56:45.076810 4956 generic.go:334] "Generic (PLEG): container finished" podID="d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" containerID="21eef2775407e8cee8fce3835b5863efb63799d81ca19dfcf149ef6b8b9239dc" exitCode=0 Sep 30 05:56:45 crc kubenswrapper[4956]: I0930 
05:56:45.077017 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgnws" event={"ID":"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c","Type":"ContainerDied","Data":"21eef2775407e8cee8fce3835b5863efb63799d81ca19dfcf149ef6b8b9239dc"} Sep 30 05:56:46 crc kubenswrapper[4956]: I0930 05:56:46.088244 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgnws" event={"ID":"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c","Type":"ContainerStarted","Data":"02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11"} Sep 30 05:56:46 crc kubenswrapper[4956]: I0930 05:56:46.107496 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wgnws" podStartSLOduration=2.70224406 podStartE2EDuration="4.107475439s" podCreationTimestamp="2025-09-30 05:56:42 +0000 UTC" firstStartedPulling="2025-09-30 05:56:44.067161629 +0000 UTC m=+1674.394282154" lastFinishedPulling="2025-09-30 05:56:45.472393008 +0000 UTC m=+1675.799513533" observedRunningTime="2025-09-30 05:56:46.104546658 +0000 UTC m=+1676.431667183" watchObservedRunningTime="2025-09-30 05:56:46.107475439 +0000 UTC m=+1676.434595964" Sep 30 05:56:53 crc kubenswrapper[4956]: I0930 05:56:53.173120 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:53 crc kubenswrapper[4956]: I0930 05:56:53.173868 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:53 crc kubenswrapper[4956]: I0930 05:56:53.228233 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:53 crc kubenswrapper[4956]: I0930 05:56:53.341070 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:56:53 crc 
kubenswrapper[4956]: E0930 05:56:53.341374 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:56:54 crc kubenswrapper[4956]: I0930 05:56:54.218778 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:54 crc kubenswrapper[4956]: I0930 05:56:54.304322 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgnws"] Sep 30 05:56:56 crc kubenswrapper[4956]: I0930 05:56:56.185231 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wgnws" podUID="d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" containerName="registry-server" containerID="cri-o://02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11" gracePeriod=2 Sep 30 05:56:56 crc kubenswrapper[4956]: I0930 05:56:56.658312 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:56 crc kubenswrapper[4956]: I0930 05:56:56.819737 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-utilities\") pod \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\" (UID: \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\") " Sep 30 05:56:56 crc kubenswrapper[4956]: I0930 05:56:56.819895 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l25l\" (UniqueName: \"kubernetes.io/projected/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-kube-api-access-2l25l\") pod \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\" (UID: \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\") " Sep 30 05:56:56 crc kubenswrapper[4956]: I0930 05:56:56.819987 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-catalog-content\") pod \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\" (UID: \"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c\") " Sep 30 05:56:56 crc kubenswrapper[4956]: I0930 05:56:56.820442 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-utilities" (OuterVolumeSpecName: "utilities") pod "d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" (UID: "d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:56:56 crc kubenswrapper[4956]: I0930 05:56:56.824825 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-kube-api-access-2l25l" (OuterVolumeSpecName: "kube-api-access-2l25l") pod "d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" (UID: "d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c"). InnerVolumeSpecName "kube-api-access-2l25l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:56:56 crc kubenswrapper[4956]: I0930 05:56:56.832514 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" (UID: "d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:56:56 crc kubenswrapper[4956]: I0930 05:56:56.922416 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:56:56 crc kubenswrapper[4956]: I0930 05:56:56.922444 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l25l\" (UniqueName: \"kubernetes.io/projected/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-kube-api-access-2l25l\") on node \"crc\" DevicePath \"\"" Sep 30 05:56:56 crc kubenswrapper[4956]: I0930 05:56:56.922455 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.195598 4956 generic.go:334] "Generic (PLEG): container finished" podID="d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" containerID="02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11" exitCode=0 Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.195636 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgnws" event={"ID":"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c","Type":"ContainerDied","Data":"02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11"} Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.195667 4956 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-wgnws" event={"ID":"d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c","Type":"ContainerDied","Data":"3ce9df436491b40250d94fa499f7209ad7805ae955797243a84803cca16026de"} Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.195691 4956 scope.go:117] "RemoveContainer" containerID="02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11" Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.195690 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgnws" Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.228995 4956 scope.go:117] "RemoveContainer" containerID="21eef2775407e8cee8fce3835b5863efb63799d81ca19dfcf149ef6b8b9239dc" Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.249195 4956 scope.go:117] "RemoveContainer" containerID="66b29cd15389f4326b00cd1551c357c3dfb31395f93f43bc202a126d6a76bb61" Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.266274 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgnws"] Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.272366 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgnws"] Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.314669 4956 scope.go:117] "RemoveContainer" containerID="02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11" Sep 30 05:56:57 crc kubenswrapper[4956]: E0930 05:56:57.315320 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11\": container with ID starting with 02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11 not found: ID does not exist" containerID="02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11" Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.315372 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11"} err="failed to get container status \"02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11\": rpc error: code = NotFound desc = could not find container \"02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11\": container with ID starting with 02a28699b0f172789009618eff71e175d352c2fb2b219b0361684921db259d11 not found: ID does not exist" Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.315436 4956 scope.go:117] "RemoveContainer" containerID="21eef2775407e8cee8fce3835b5863efb63799d81ca19dfcf149ef6b8b9239dc" Sep 30 05:56:57 crc kubenswrapper[4956]: E0930 05:56:57.315878 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21eef2775407e8cee8fce3835b5863efb63799d81ca19dfcf149ef6b8b9239dc\": container with ID starting with 21eef2775407e8cee8fce3835b5863efb63799d81ca19dfcf149ef6b8b9239dc not found: ID does not exist" containerID="21eef2775407e8cee8fce3835b5863efb63799d81ca19dfcf149ef6b8b9239dc" Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.315914 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21eef2775407e8cee8fce3835b5863efb63799d81ca19dfcf149ef6b8b9239dc"} err="failed to get container status \"21eef2775407e8cee8fce3835b5863efb63799d81ca19dfcf149ef6b8b9239dc\": rpc error: code = NotFound desc = could not find container \"21eef2775407e8cee8fce3835b5863efb63799d81ca19dfcf149ef6b8b9239dc\": container with ID starting with 21eef2775407e8cee8fce3835b5863efb63799d81ca19dfcf149ef6b8b9239dc not found: ID does not exist" Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.315938 4956 scope.go:117] "RemoveContainer" containerID="66b29cd15389f4326b00cd1551c357c3dfb31395f93f43bc202a126d6a76bb61" Sep 30 05:56:57 crc kubenswrapper[4956]: E0930 
05:56:57.316460 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66b29cd15389f4326b00cd1551c357c3dfb31395f93f43bc202a126d6a76bb61\": container with ID starting with 66b29cd15389f4326b00cd1551c357c3dfb31395f93f43bc202a126d6a76bb61 not found: ID does not exist" containerID="66b29cd15389f4326b00cd1551c357c3dfb31395f93f43bc202a126d6a76bb61" Sep 30 05:56:57 crc kubenswrapper[4956]: I0930 05:56:57.316500 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b29cd15389f4326b00cd1551c357c3dfb31395f93f43bc202a126d6a76bb61"} err="failed to get container status \"66b29cd15389f4326b00cd1551c357c3dfb31395f93f43bc202a126d6a76bb61\": rpc error: code = NotFound desc = could not find container \"66b29cd15389f4326b00cd1551c357c3dfb31395f93f43bc202a126d6a76bb61\": container with ID starting with 66b29cd15389f4326b00cd1551c357c3dfb31395f93f43bc202a126d6a76bb61 not found: ID does not exist" Sep 30 05:56:58 crc kubenswrapper[4956]: I0930 05:56:58.352584 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" path="/var/lib/kubelet/pods/d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c/volumes" Sep 30 05:57:08 crc kubenswrapper[4956]: I0930 05:57:08.341949 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:57:08 crc kubenswrapper[4956]: E0930 05:57:08.343386 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:57:13 crc kubenswrapper[4956]: I0930 05:57:13.119036 
4956 scope.go:117] "RemoveContainer" containerID="3bfbff8657b38fae883e6d49f7723f7336c138325a9cb86b8eb583bcdd980336" Sep 30 05:57:16 crc kubenswrapper[4956]: I0930 05:57:16.035331 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-z68cs"] Sep 30 05:57:16 crc kubenswrapper[4956]: I0930 05:57:16.045681 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-z68cs"] Sep 30 05:57:16 crc kubenswrapper[4956]: I0930 05:57:16.353624 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a62ad408-87f6-49ec-a4ff-690217492edd" path="/var/lib/kubelet/pods/a62ad408-87f6-49ec-a4ff-690217492edd/volumes" Sep 30 05:57:18 crc kubenswrapper[4956]: I0930 05:57:18.035190 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-kkgnj"] Sep 30 05:57:18 crc kubenswrapper[4956]: I0930 05:57:18.046785 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-whg9s"] Sep 30 05:57:18 crc kubenswrapper[4956]: I0930 05:57:18.056654 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-whg9s"] Sep 30 05:57:18 crc kubenswrapper[4956]: I0930 05:57:18.066416 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-kkgnj"] Sep 30 05:57:18 crc kubenswrapper[4956]: I0930 05:57:18.351221 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d8c8135-c4bc-40cf-ad92-95995fba0cd0" path="/var/lib/kubelet/pods/4d8c8135-c4bc-40cf-ad92-95995fba0cd0/volumes" Sep 30 05:57:18 crc kubenswrapper[4956]: I0930 05:57:18.351793 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54dd590b-ab6c-4760-83f9-da40bc524570" path="/var/lib/kubelet/pods/54dd590b-ab6c-4760-83f9-da40bc524570/volumes" Sep 30 05:57:21 crc kubenswrapper[4956]: I0930 05:57:21.342138 4956 scope.go:117] "RemoveContainer" 
containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:57:21 crc kubenswrapper[4956]: E0930 05:57:21.343751 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:57:26 crc kubenswrapper[4956]: I0930 05:57:26.027317 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e83b-account-create-ljb8m"] Sep 30 05:57:26 crc kubenswrapper[4956]: I0930 05:57:26.034593 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e83b-account-create-ljb8m"] Sep 30 05:57:26 crc kubenswrapper[4956]: I0930 05:57:26.351633 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b6e1d0-5365-4ff6-8d62-63e63a676194" path="/var/lib/kubelet/pods/23b6e1d0-5365-4ff6-8d62-63e63a676194/volumes" Sep 30 05:57:36 crc kubenswrapper[4956]: I0930 05:57:36.341313 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:57:36 crc kubenswrapper[4956]: E0930 05:57:36.342227 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:57:37 crc kubenswrapper[4956]: I0930 05:57:37.027731 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-858b-account-create-bjjkx"] Sep 30 05:57:37 crc kubenswrapper[4956]: I0930 05:57:37.034831 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f1e7-account-create-nx5br"] Sep 30 05:57:37 crc kubenswrapper[4956]: I0930 05:57:37.042250 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f1e7-account-create-nx5br"] Sep 30 05:57:37 crc kubenswrapper[4956]: I0930 05:57:37.048719 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-858b-account-create-bjjkx"] Sep 30 05:57:38 crc kubenswrapper[4956]: I0930 05:57:38.354847 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="833452de-c7d3-4b24-939d-13dccbbddadb" path="/var/lib/kubelet/pods/833452de-c7d3-4b24-939d-13dccbbddadb/volumes" Sep 30 05:57:38 crc kubenswrapper[4956]: I0930 05:57:38.355397 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3d5954c-8871-4eb9-aa3d-4ec131b12791" path="/var/lib/kubelet/pods/f3d5954c-8871-4eb9-aa3d-4ec131b12791/volumes" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.084690 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q94fq"] Sep 30 05:57:43 crc kubenswrapper[4956]: E0930 05:57:43.085710 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" containerName="extract-utilities" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.085728 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" containerName="extract-utilities" Sep 30 05:57:43 crc kubenswrapper[4956]: E0930 05:57:43.085760 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" containerName="extract-content" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.085769 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" containerName="extract-content" Sep 30 05:57:43 crc kubenswrapper[4956]: E0930 05:57:43.085811 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" containerName="registry-server" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.085820 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" containerName="registry-server" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.086037 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e64ece-c9db-4c3d-ac9e-a9e300f76a5c" containerName="registry-server" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.088186 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.111610 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q94fq"] Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.206032 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e35067-75fa-4c00-ba5b-9fe5097cc845-utilities\") pod \"redhat-operators-q94fq\" (UID: \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\") " pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.206083 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e35067-75fa-4c00-ba5b-9fe5097cc845-catalog-content\") pod \"redhat-operators-q94fq\" (UID: \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\") " pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.206836 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jnx5g\" (UniqueName: \"kubernetes.io/projected/c9e35067-75fa-4c00-ba5b-9fe5097cc845-kube-api-access-jnx5g\") pod \"redhat-operators-q94fq\" (UID: \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\") " pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.309065 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnx5g\" (UniqueName: \"kubernetes.io/projected/c9e35067-75fa-4c00-ba5b-9fe5097cc845-kube-api-access-jnx5g\") pod \"redhat-operators-q94fq\" (UID: \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\") " pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.309174 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e35067-75fa-4c00-ba5b-9fe5097cc845-utilities\") pod \"redhat-operators-q94fq\" (UID: \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\") " pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.309212 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e35067-75fa-4c00-ba5b-9fe5097cc845-catalog-content\") pod \"redhat-operators-q94fq\" (UID: \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\") " pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.309725 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e35067-75fa-4c00-ba5b-9fe5097cc845-utilities\") pod \"redhat-operators-q94fq\" (UID: \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\") " pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.309782 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c9e35067-75fa-4c00-ba5b-9fe5097cc845-catalog-content\") pod \"redhat-operators-q94fq\" (UID: \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\") " pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.327716 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnx5g\" (UniqueName: \"kubernetes.io/projected/c9e35067-75fa-4c00-ba5b-9fe5097cc845-kube-api-access-jnx5g\") pod \"redhat-operators-q94fq\" (UID: \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\") " pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.411646 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:43 crc kubenswrapper[4956]: I0930 05:57:43.882976 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q94fq"] Sep 30 05:57:44 crc kubenswrapper[4956]: I0930 05:57:44.666346 4956 generic.go:334] "Generic (PLEG): container finished" podID="c9e35067-75fa-4c00-ba5b-9fe5097cc845" containerID="65fd4815988d22ae620f126e9c018d68437238da05f6a56087112e3eb121f5a1" exitCode=0 Sep 30 05:57:44 crc kubenswrapper[4956]: I0930 05:57:44.666402 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q94fq" event={"ID":"c9e35067-75fa-4c00-ba5b-9fe5097cc845","Type":"ContainerDied","Data":"65fd4815988d22ae620f126e9c018d68437238da05f6a56087112e3eb121f5a1"} Sep 30 05:57:44 crc kubenswrapper[4956]: I0930 05:57:44.666592 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q94fq" event={"ID":"c9e35067-75fa-4c00-ba5b-9fe5097cc845","Type":"ContainerStarted","Data":"f1c2428c2bf7255fc47c37ce207e22503f1834d543ea2538f4b4d2657ce5f4cf"} Sep 30 05:57:45 crc kubenswrapper[4956]: I0930 05:57:45.677747 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="7577cb07-1bcb-4432-a6e8-f57c1f1b2421" containerID="d2779da15f3f1233557de64d53a8837b74a137453471d97e33ae3d64c9cfaeaf" exitCode=0 Sep 30 05:57:45 crc kubenswrapper[4956]: I0930 05:57:45.677804 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" event={"ID":"7577cb07-1bcb-4432-a6e8-f57c1f1b2421","Type":"ContainerDied","Data":"d2779da15f3f1233557de64d53a8837b74a137453471d97e33ae3d64c9cfaeaf"} Sep 30 05:57:46 crc kubenswrapper[4956]: I0930 05:57:46.691305 4956 generic.go:334] "Generic (PLEG): container finished" podID="c9e35067-75fa-4c00-ba5b-9fe5097cc845" containerID="9cfc0c62b977a8b32bfb1547359164fbd12bf783fd6204e85291a181a43a77d2" exitCode=0 Sep 30 05:57:46 crc kubenswrapper[4956]: I0930 05:57:46.691427 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q94fq" event={"ID":"c9e35067-75fa-4c00-ba5b-9fe5097cc845","Type":"ContainerDied","Data":"9cfc0c62b977a8b32bfb1547359164fbd12bf783fd6204e85291a181a43a77d2"} Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.129070 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.294072 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxlrq\" (UniqueName: \"kubernetes.io/projected/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-kube-api-access-kxlrq\") pod \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\" (UID: \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\") " Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.294329 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-inventory\") pod \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\" (UID: \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\") " Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.294387 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-ssh-key\") pod \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\" (UID: \"7577cb07-1bcb-4432-a6e8-f57c1f1b2421\") " Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.340986 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:57:47 crc kubenswrapper[4956]: E0930 05:57:47.341572 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.343470 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-kube-api-access-kxlrq" (OuterVolumeSpecName: "kube-api-access-kxlrq") pod "7577cb07-1bcb-4432-a6e8-f57c1f1b2421" (UID: "7577cb07-1bcb-4432-a6e8-f57c1f1b2421"). InnerVolumeSpecName "kube-api-access-kxlrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.366108 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-inventory" (OuterVolumeSpecName: "inventory") pod "7577cb07-1bcb-4432-a6e8-f57c1f1b2421" (UID: "7577cb07-1bcb-4432-a6e8-f57c1f1b2421"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.381940 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7577cb07-1bcb-4432-a6e8-f57c1f1b2421" (UID: "7577cb07-1bcb-4432-a6e8-f57c1f1b2421"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.397438 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.397482 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.397496 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxlrq\" (UniqueName: \"kubernetes.io/projected/7577cb07-1bcb-4432-a6e8-f57c1f1b2421-kube-api-access-kxlrq\") on node \"crc\" DevicePath \"\"" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.702037 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.702025 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm" event={"ID":"7577cb07-1bcb-4432-a6e8-f57c1f1b2421","Type":"ContainerDied","Data":"d5c3875b0bedb5d411970a5d9e0e358ec1e94e37f4286c75ddc536cd7a826938"} Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.702212 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5c3875b0bedb5d411970a5d9e0e358ec1e94e37f4286c75ddc536cd7a826938" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.705671 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q94fq" event={"ID":"c9e35067-75fa-4c00-ba5b-9fe5097cc845","Type":"ContainerStarted","Data":"2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6"} Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 
05:57:47.745084 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q94fq" podStartSLOduration=2.304675581 podStartE2EDuration="4.745061921s" podCreationTimestamp="2025-09-30 05:57:43 +0000 UTC" firstStartedPulling="2025-09-30 05:57:44.668697931 +0000 UTC m=+1734.995818456" lastFinishedPulling="2025-09-30 05:57:47.109084271 +0000 UTC m=+1737.436204796" observedRunningTime="2025-09-30 05:57:47.737228505 +0000 UTC m=+1738.064349040" watchObservedRunningTime="2025-09-30 05:57:47.745061921 +0000 UTC m=+1738.072182446" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.812901 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f"] Sep 30 05:57:47 crc kubenswrapper[4956]: E0930 05:57:47.813494 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7577cb07-1bcb-4432-a6e8-f57c1f1b2421" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.813527 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7577cb07-1bcb-4432-a6e8-f57c1f1b2421" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.813750 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7577cb07-1bcb-4432-a6e8-f57c1f1b2421" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.814703 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.817397 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.821345 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.821362 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.835575 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f"] Sep 30 05:57:47 crc kubenswrapper[4956]: I0930 05:57:47.835957 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn" Sep 30 05:57:48 crc kubenswrapper[4956]: I0930 05:57:48.010496 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8861310e-59f9-47fc-9224-fc01da1aab28-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4j92f\" (UID: \"8861310e-59f9-47fc-9224-fc01da1aab28\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:48 crc kubenswrapper[4956]: I0930 05:57:48.010708 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8861310e-59f9-47fc-9224-fc01da1aab28-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4j92f\" (UID: \"8861310e-59f9-47fc-9224-fc01da1aab28\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:48 crc kubenswrapper[4956]: I0930 05:57:48.010748 4956 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwb86\" (UniqueName: \"kubernetes.io/projected/8861310e-59f9-47fc-9224-fc01da1aab28-kube-api-access-vwb86\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4j92f\" (UID: \"8861310e-59f9-47fc-9224-fc01da1aab28\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:48 crc kubenswrapper[4956]: I0930 05:57:48.112538 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8861310e-59f9-47fc-9224-fc01da1aab28-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4j92f\" (UID: \"8861310e-59f9-47fc-9224-fc01da1aab28\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:48 crc kubenswrapper[4956]: I0930 05:57:48.112677 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8861310e-59f9-47fc-9224-fc01da1aab28-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4j92f\" (UID: \"8861310e-59f9-47fc-9224-fc01da1aab28\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:48 crc kubenswrapper[4956]: I0930 05:57:48.112699 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwb86\" (UniqueName: \"kubernetes.io/projected/8861310e-59f9-47fc-9224-fc01da1aab28-kube-api-access-vwb86\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4j92f\" (UID: \"8861310e-59f9-47fc-9224-fc01da1aab28\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:48 crc kubenswrapper[4956]: I0930 05:57:48.117787 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8861310e-59f9-47fc-9224-fc01da1aab28-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-4j92f\" (UID: \"8861310e-59f9-47fc-9224-fc01da1aab28\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:48 crc kubenswrapper[4956]: I0930 05:57:48.118697 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8861310e-59f9-47fc-9224-fc01da1aab28-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4j92f\" (UID: \"8861310e-59f9-47fc-9224-fc01da1aab28\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:48 crc kubenswrapper[4956]: I0930 05:57:48.132660 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwb86\" (UniqueName: \"kubernetes.io/projected/8861310e-59f9-47fc-9224-fc01da1aab28-kube-api-access-vwb86\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4j92f\" (UID: \"8861310e-59f9-47fc-9224-fc01da1aab28\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:48 crc kubenswrapper[4956]: I0930 05:57:48.184435 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:48 crc kubenswrapper[4956]: I0930 05:57:48.702163 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f"] Sep 30 05:57:48 crc kubenswrapper[4956]: W0930 05:57:48.704576 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8861310e_59f9_47fc_9224_fc01da1aab28.slice/crio-19703c588207c92d4a849e9219790e171902a66e6377f3b1579dc17916f8acd6 WatchSource:0}: Error finding container 19703c588207c92d4a849e9219790e171902a66e6377f3b1579dc17916f8acd6: Status 404 returned error can't find the container with id 19703c588207c92d4a849e9219790e171902a66e6377f3b1579dc17916f8acd6 Sep 30 05:57:48 crc kubenswrapper[4956]: I0930 05:57:48.716572 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" event={"ID":"8861310e-59f9-47fc-9224-fc01da1aab28","Type":"ContainerStarted","Data":"19703c588207c92d4a849e9219790e171902a66e6377f3b1579dc17916f8acd6"} Sep 30 05:57:49 crc kubenswrapper[4956]: I0930 05:57:49.730243 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" event={"ID":"8861310e-59f9-47fc-9224-fc01da1aab28","Type":"ContainerStarted","Data":"80bd6471337efb428891db67bae2c336929a27620025d5c2435698ab951c1d98"} Sep 30 05:57:49 crc kubenswrapper[4956]: I0930 05:57:49.752775 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" podStartSLOduration=2.275096435 podStartE2EDuration="2.752753958s" podCreationTimestamp="2025-09-30 05:57:47 +0000 UTC" firstStartedPulling="2025-09-30 05:57:48.707475789 +0000 UTC m=+1739.034596304" lastFinishedPulling="2025-09-30 05:57:49.185133302 +0000 UTC 
m=+1739.512253827" observedRunningTime="2025-09-30 05:57:49.744724736 +0000 UTC m=+1740.071845261" watchObservedRunningTime="2025-09-30 05:57:49.752753958 +0000 UTC m=+1740.079874483" Sep 30 05:57:53 crc kubenswrapper[4956]: I0930 05:57:53.412506 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:53 crc kubenswrapper[4956]: I0930 05:57:53.413015 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:53 crc kubenswrapper[4956]: I0930 05:57:53.457285 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:53 crc kubenswrapper[4956]: I0930 05:57:53.823269 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:53 crc kubenswrapper[4956]: I0930 05:57:53.880504 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q94fq"] Sep 30 05:57:54 crc kubenswrapper[4956]: I0930 05:57:54.776080 4956 generic.go:334] "Generic (PLEG): container finished" podID="8861310e-59f9-47fc-9224-fc01da1aab28" containerID="80bd6471337efb428891db67bae2c336929a27620025d5c2435698ab951c1d98" exitCode=0 Sep 30 05:57:54 crc kubenswrapper[4956]: I0930 05:57:54.776187 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" event={"ID":"8861310e-59f9-47fc-9224-fc01da1aab28","Type":"ContainerDied","Data":"80bd6471337efb428891db67bae2c336929a27620025d5c2435698ab951c1d98"} Sep 30 05:57:55 crc kubenswrapper[4956]: I0930 05:57:55.783215 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q94fq" podUID="c9e35067-75fa-4c00-ba5b-9fe5097cc845" containerName="registry-server" 
containerID="cri-o://2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6" gracePeriod=2 Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.169309 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.275033 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8861310e-59f9-47fc-9224-fc01da1aab28-inventory\") pod \"8861310e-59f9-47fc-9224-fc01da1aab28\" (UID: \"8861310e-59f9-47fc-9224-fc01da1aab28\") " Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.275084 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8861310e-59f9-47fc-9224-fc01da1aab28-ssh-key\") pod \"8861310e-59f9-47fc-9224-fc01da1aab28\" (UID: \"8861310e-59f9-47fc-9224-fc01da1aab28\") " Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.275157 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwb86\" (UniqueName: \"kubernetes.io/projected/8861310e-59f9-47fc-9224-fc01da1aab28-kube-api-access-vwb86\") pod \"8861310e-59f9-47fc-9224-fc01da1aab28\" (UID: \"8861310e-59f9-47fc-9224-fc01da1aab28\") " Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.283362 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8861310e-59f9-47fc-9224-fc01da1aab28-kube-api-access-vwb86" (OuterVolumeSpecName: "kube-api-access-vwb86") pod "8861310e-59f9-47fc-9224-fc01da1aab28" (UID: "8861310e-59f9-47fc-9224-fc01da1aab28"). InnerVolumeSpecName "kube-api-access-vwb86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.302654 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8861310e-59f9-47fc-9224-fc01da1aab28-inventory" (OuterVolumeSpecName: "inventory") pod "8861310e-59f9-47fc-9224-fc01da1aab28" (UID: "8861310e-59f9-47fc-9224-fc01da1aab28"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.306714 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8861310e-59f9-47fc-9224-fc01da1aab28-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8861310e-59f9-47fc-9224-fc01da1aab28" (UID: "8861310e-59f9-47fc-9224-fc01da1aab28"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.377150 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8861310e-59f9-47fc-9224-fc01da1aab28-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.377181 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8861310e-59f9-47fc-9224-fc01da1aab28-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.377192 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwb86\" (UniqueName: \"kubernetes.io/projected/8861310e-59f9-47fc-9224-fc01da1aab28-kube-api-access-vwb86\") on node \"crc\" DevicePath \"\"" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.685465 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.784678 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e35067-75fa-4c00-ba5b-9fe5097cc845-utilities\") pod \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\" (UID: \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\") " Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.784733 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnx5g\" (UniqueName: \"kubernetes.io/projected/c9e35067-75fa-4c00-ba5b-9fe5097cc845-kube-api-access-jnx5g\") pod \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\" (UID: \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\") " Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.784838 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e35067-75fa-4c00-ba5b-9fe5097cc845-catalog-content\") pod \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\" (UID: \"c9e35067-75fa-4c00-ba5b-9fe5097cc845\") " Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.785570 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e35067-75fa-4c00-ba5b-9fe5097cc845-utilities" (OuterVolumeSpecName: "utilities") pod "c9e35067-75fa-4c00-ba5b-9fe5097cc845" (UID: "c9e35067-75fa-4c00-ba5b-9fe5097cc845"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.785727 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e35067-75fa-4c00-ba5b-9fe5097cc845-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.787927 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e35067-75fa-4c00-ba5b-9fe5097cc845-kube-api-access-jnx5g" (OuterVolumeSpecName: "kube-api-access-jnx5g") pod "c9e35067-75fa-4c00-ba5b-9fe5097cc845" (UID: "c9e35067-75fa-4c00-ba5b-9fe5097cc845"). InnerVolumeSpecName "kube-api-access-jnx5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.796637 4956 generic.go:334] "Generic (PLEG): container finished" podID="c9e35067-75fa-4c00-ba5b-9fe5097cc845" containerID="2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6" exitCode=0 Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.796736 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q94fq" event={"ID":"c9e35067-75fa-4c00-ba5b-9fe5097cc845","Type":"ContainerDied","Data":"2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6"} Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.796759 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q94fq" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.796776 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q94fq" event={"ID":"c9e35067-75fa-4c00-ba5b-9fe5097cc845","Type":"ContainerDied","Data":"f1c2428c2bf7255fc47c37ce207e22503f1834d543ea2538f4b4d2657ce5f4cf"} Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.796804 4956 scope.go:117] "RemoveContainer" containerID="2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.799987 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" event={"ID":"8861310e-59f9-47fc-9224-fc01da1aab28","Type":"ContainerDied","Data":"19703c588207c92d4a849e9219790e171902a66e6377f3b1579dc17916f8acd6"} Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.800031 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19703c588207c92d4a849e9219790e171902a66e6377f3b1579dc17916f8acd6" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.800006 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4j92f" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.836038 4956 scope.go:117] "RemoveContainer" containerID="9cfc0c62b977a8b32bfb1547359164fbd12bf783fd6204e85291a181a43a77d2" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.872838 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7"] Sep 30 05:57:56 crc kubenswrapper[4956]: E0930 05:57:56.873610 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e35067-75fa-4c00-ba5b-9fe5097cc845" containerName="extract-utilities" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.873650 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e35067-75fa-4c00-ba5b-9fe5097cc845" containerName="extract-utilities" Sep 30 05:57:56 crc kubenswrapper[4956]: E0930 05:57:56.873703 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8861310e-59f9-47fc-9224-fc01da1aab28" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.873718 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8861310e-59f9-47fc-9224-fc01da1aab28" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 05:57:56 crc kubenswrapper[4956]: E0930 05:57:56.873741 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e35067-75fa-4c00-ba5b-9fe5097cc845" containerName="registry-server" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.873754 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e35067-75fa-4c00-ba5b-9fe5097cc845" containerName="registry-server" Sep 30 05:57:56 crc kubenswrapper[4956]: E0930 05:57:56.873772 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e35067-75fa-4c00-ba5b-9fe5097cc845" containerName="extract-content" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 
05:57:56.873783 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e35067-75fa-4c00-ba5b-9fe5097cc845" containerName="extract-content" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.874168 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8861310e-59f9-47fc-9224-fc01da1aab28" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.874213 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e35067-75fa-4c00-ba5b-9fe5097cc845" containerName="registry-server" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.875295 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.877369 4956 scope.go:117] "RemoveContainer" containerID="65fd4815988d22ae620f126e9c018d68437238da05f6a56087112e3eb121f5a1" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.879807 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.880106 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.881216 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.884590 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.885086 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7"] Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.887989 4956 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-jnx5g\" (UniqueName: \"kubernetes.io/projected/c9e35067-75fa-4c00-ba5b-9fe5097cc845-kube-api-access-jnx5g\") on node \"crc\" DevicePath \"\"" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.907205 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e35067-75fa-4c00-ba5b-9fe5097cc845-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9e35067-75fa-4c00-ba5b-9fe5097cc845" (UID: "c9e35067-75fa-4c00-ba5b-9fe5097cc845"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.933269 4956 scope.go:117] "RemoveContainer" containerID="2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6" Sep 30 05:57:56 crc kubenswrapper[4956]: E0930 05:57:56.934820 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6\": container with ID starting with 2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6 not found: ID does not exist" containerID="2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.934854 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6"} err="failed to get container status \"2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6\": rpc error: code = NotFound desc = could not find container \"2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6\": container with ID starting with 2a9d9afcb634d83fd4732aa3a9f638cf87b1263aba4b6e4b8cdc6d433cc1dbf6 not found: ID does not exist" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.934876 4956 scope.go:117] "RemoveContainer" 
containerID="9cfc0c62b977a8b32bfb1547359164fbd12bf783fd6204e85291a181a43a77d2" Sep 30 05:57:56 crc kubenswrapper[4956]: E0930 05:57:56.936218 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cfc0c62b977a8b32bfb1547359164fbd12bf783fd6204e85291a181a43a77d2\": container with ID starting with 9cfc0c62b977a8b32bfb1547359164fbd12bf783fd6204e85291a181a43a77d2 not found: ID does not exist" containerID="9cfc0c62b977a8b32bfb1547359164fbd12bf783fd6204e85291a181a43a77d2" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.936253 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cfc0c62b977a8b32bfb1547359164fbd12bf783fd6204e85291a181a43a77d2"} err="failed to get container status \"9cfc0c62b977a8b32bfb1547359164fbd12bf783fd6204e85291a181a43a77d2\": rpc error: code = NotFound desc = could not find container \"9cfc0c62b977a8b32bfb1547359164fbd12bf783fd6204e85291a181a43a77d2\": container with ID starting with 9cfc0c62b977a8b32bfb1547359164fbd12bf783fd6204e85291a181a43a77d2 not found: ID does not exist" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.936277 4956 scope.go:117] "RemoveContainer" containerID="65fd4815988d22ae620f126e9c018d68437238da05f6a56087112e3eb121f5a1" Sep 30 05:57:56 crc kubenswrapper[4956]: E0930 05:57:56.938147 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65fd4815988d22ae620f126e9c018d68437238da05f6a56087112e3eb121f5a1\": container with ID starting with 65fd4815988d22ae620f126e9c018d68437238da05f6a56087112e3eb121f5a1 not found: ID does not exist" containerID="65fd4815988d22ae620f126e9c018d68437238da05f6a56087112e3eb121f5a1" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.938183 4956 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65fd4815988d22ae620f126e9c018d68437238da05f6a56087112e3eb121f5a1"} err="failed to get container status \"65fd4815988d22ae620f126e9c018d68437238da05f6a56087112e3eb121f5a1\": rpc error: code = NotFound desc = could not find container \"65fd4815988d22ae620f126e9c018d68437238da05f6a56087112e3eb121f5a1\": container with ID starting with 65fd4815988d22ae620f126e9c018d68437238da05f6a56087112e3eb121f5a1 not found: ID does not exist" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.989210 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc76e3e2-ccde-4622-bcd2-fc347ee16271-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tnwj7\" (UID: \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.989369 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc76e3e2-ccde-4622-bcd2-fc347ee16271-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tnwj7\" (UID: \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.989429 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nrxf\" (UniqueName: \"kubernetes.io/projected/cc76e3e2-ccde-4622-bcd2-fc347ee16271-kube-api-access-8nrxf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tnwj7\" (UID: \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" Sep 30 05:57:56 crc kubenswrapper[4956]: I0930 05:57:56.989501 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c9e35067-75fa-4c00-ba5b-9fe5097cc845-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 05:57:57 crc kubenswrapper[4956]: I0930 05:57:57.090959 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc76e3e2-ccde-4622-bcd2-fc347ee16271-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tnwj7\" (UID: \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" Sep 30 05:57:57 crc kubenswrapper[4956]: I0930 05:57:57.091077 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nrxf\" (UniqueName: \"kubernetes.io/projected/cc76e3e2-ccde-4622-bcd2-fc347ee16271-kube-api-access-8nrxf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tnwj7\" (UID: \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" Sep 30 05:57:57 crc kubenswrapper[4956]: I0930 05:57:57.091171 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc76e3e2-ccde-4622-bcd2-fc347ee16271-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tnwj7\" (UID: \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" Sep 30 05:57:57 crc kubenswrapper[4956]: I0930 05:57:57.094660 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc76e3e2-ccde-4622-bcd2-fc347ee16271-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tnwj7\" (UID: \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" Sep 30 05:57:57 crc kubenswrapper[4956]: I0930 05:57:57.096553 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/cc76e3e2-ccde-4622-bcd2-fc347ee16271-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tnwj7\" (UID: \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" Sep 30 05:57:57 crc kubenswrapper[4956]: I0930 05:57:57.109437 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nrxf\" (UniqueName: \"kubernetes.io/projected/cc76e3e2-ccde-4622-bcd2-fc347ee16271-kube-api-access-8nrxf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tnwj7\" (UID: \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" Sep 30 05:57:57 crc kubenswrapper[4956]: I0930 05:57:57.224168 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q94fq"] Sep 30 05:57:57 crc kubenswrapper[4956]: I0930 05:57:57.231060 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q94fq"] Sep 30 05:57:57 crc kubenswrapper[4956]: I0930 05:57:57.306412 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" Sep 30 05:57:57 crc kubenswrapper[4956]: I0930 05:57:57.843438 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7"] Sep 30 05:57:58 crc kubenswrapper[4956]: I0930 05:57:58.357968 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e35067-75fa-4c00-ba5b-9fe5097cc845" path="/var/lib/kubelet/pods/c9e35067-75fa-4c00-ba5b-9fe5097cc845/volumes" Sep 30 05:57:58 crc kubenswrapper[4956]: I0930 05:57:58.820701 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" event={"ID":"cc76e3e2-ccde-4622-bcd2-fc347ee16271","Type":"ContainerStarted","Data":"419ff0b490ae7fd32b10844a846b717b09b5a132b8772d5de277acc654f13342"} Sep 30 05:57:58 crc kubenswrapper[4956]: I0930 05:57:58.820744 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" event={"ID":"cc76e3e2-ccde-4622-bcd2-fc347ee16271","Type":"ContainerStarted","Data":"aaab18566a5751e901c7ba7bd147a76b4213e7a3bb1ee59fef26fd8661f9bd37"} Sep 30 05:57:58 crc kubenswrapper[4956]: I0930 05:57:58.848258 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" podStartSLOduration=2.393849862 podStartE2EDuration="2.848239885s" podCreationTimestamp="2025-09-30 05:57:56 +0000 UTC" firstStartedPulling="2025-09-30 05:57:57.840451173 +0000 UTC m=+1748.167571698" lastFinishedPulling="2025-09-30 05:57:58.294841196 +0000 UTC m=+1748.621961721" observedRunningTime="2025-09-30 05:57:58.840533943 +0000 UTC m=+1749.167654468" watchObservedRunningTime="2025-09-30 05:57:58.848239885 +0000 UTC m=+1749.175360410" Sep 30 05:57:59 crc kubenswrapper[4956]: I0930 05:57:59.341306 4956 scope.go:117] "RemoveContainer" 
containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:57:59 crc kubenswrapper[4956]: E0930 05:57:59.341962 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:58:01 crc kubenswrapper[4956]: I0930 05:58:01.040253 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s77tj"] Sep 30 05:58:01 crc kubenswrapper[4956]: I0930 05:58:01.049006 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-s77tj"] Sep 30 05:58:02 crc kubenswrapper[4956]: I0930 05:58:02.354350 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="049f3cac-fecc-4e1f-adcf-95b6c9202221" path="/var/lib/kubelet/pods/049f3cac-fecc-4e1f-adcf-95b6c9202221/volumes" Sep 30 05:58:10 crc kubenswrapper[4956]: I0930 05:58:10.346662 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 05:58:10 crc kubenswrapper[4956]: E0930 05:58:10.347457 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 05:58:13 crc kubenswrapper[4956]: I0930 05:58:13.201894 4956 scope.go:117] "RemoveContainer" 
containerID="93663a57af4168398e0cd80b1148fa6421e0206ef579b26802d61d49f44138e9"
Sep 30 05:58:13 crc kubenswrapper[4956]: I0930 05:58:13.235951 4956 scope.go:117] "RemoveContainer" containerID="d6fc9d8af7e549440a093bfdd6758f6a47a88026851e093bc423bffed4754901"
Sep 30 05:58:13 crc kubenswrapper[4956]: I0930 05:58:13.287193 4956 scope.go:117] "RemoveContainer" containerID="ad980e1cc3af3d725cb7ebc8008d86cfe4f4ce1e9d706d0b5e57b0be5200b678"
Sep 30 05:58:13 crc kubenswrapper[4956]: I0930 05:58:13.347778 4956 scope.go:117] "RemoveContainer" containerID="5ff7491d10e34d87f1d8b7f2e0977c335b771a49054273a019d0ee97d111d953"
Sep 30 05:58:13 crc kubenswrapper[4956]: I0930 05:58:13.392227 4956 scope.go:117] "RemoveContainer" containerID="07ff28d47764f596ce31b1eb3eb156fbc57d9c27c967fbd11cc84e27f711a6da"
Sep 30 05:58:13 crc kubenswrapper[4956]: I0930 05:58:13.454724 4956 scope.go:117] "RemoveContainer" containerID="e6cc2a14e90000a60308ba8952958484dc0ca122a291d3702de25f99317c76c8"
Sep 30 05:58:13 crc kubenswrapper[4956]: I0930 05:58:13.490492 4956 scope.go:117] "RemoveContainer" containerID="fd19bf5931c1a9956c3ed67328d1a107bf12d01ddfa8d24940a96d44f2deb2b2"
Sep 30 05:58:24 crc kubenswrapper[4956]: I0930 05:58:24.340983 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b"
Sep 30 05:58:25 crc kubenswrapper[4956]: I0930 05:58:25.069590 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"0dfc54b32590324bdbf7360f23fe87e3bb3d8dad1586a2b8c737285b6be9a13a"}
Sep 30 05:58:26 crc kubenswrapper[4956]: I0930 05:58:26.056102 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xzlrc"]
Sep 30 05:58:26 crc kubenswrapper[4956]: I0930 05:58:26.065688 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xzlrc"]
Sep 30 05:58:26 crc kubenswrapper[4956]: I0930 05:58:26.350935 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89d1f309-f110-45e2-ae88-8043eca7553e" path="/var/lib/kubelet/pods/89d1f309-f110-45e2-ae88-8043eca7553e/volumes"
Sep 30 05:58:29 crc kubenswrapper[4956]: I0930 05:58:29.044329 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kb9pc"]
Sep 30 05:58:29 crc kubenswrapper[4956]: I0930 05:58:29.061086 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kb9pc"]
Sep 30 05:58:30 crc kubenswrapper[4956]: I0930 05:58:30.354938 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1691e854-b1af-4974-b06a-716b66af43b4" path="/var/lib/kubelet/pods/1691e854-b1af-4974-b06a-716b66af43b4/volumes"
Sep 30 05:58:41 crc kubenswrapper[4956]: I0930 05:58:41.213956 4956 generic.go:334] "Generic (PLEG): container finished" podID="cc76e3e2-ccde-4622-bcd2-fc347ee16271" containerID="419ff0b490ae7fd32b10844a846b717b09b5a132b8772d5de277acc654f13342" exitCode=0
Sep 30 05:58:41 crc kubenswrapper[4956]: I0930 05:58:41.214063 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" event={"ID":"cc76e3e2-ccde-4622-bcd2-fc347ee16271","Type":"ContainerDied","Data":"419ff0b490ae7fd32b10844a846b717b09b5a132b8772d5de277acc654f13342"}
Sep 30 05:58:42 crc kubenswrapper[4956]: I0930 05:58:42.687011 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7"
Sep 30 05:58:42 crc kubenswrapper[4956]: I0930 05:58:42.760600 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nrxf\" (UniqueName: \"kubernetes.io/projected/cc76e3e2-ccde-4622-bcd2-fc347ee16271-kube-api-access-8nrxf\") pod \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\" (UID: \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\") "
Sep 30 05:58:42 crc kubenswrapper[4956]: I0930 05:58:42.760663 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc76e3e2-ccde-4622-bcd2-fc347ee16271-inventory\") pod \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\" (UID: \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\") "
Sep 30 05:58:42 crc kubenswrapper[4956]: I0930 05:58:42.760932 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc76e3e2-ccde-4622-bcd2-fc347ee16271-ssh-key\") pod \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\" (UID: \"cc76e3e2-ccde-4622-bcd2-fc347ee16271\") "
Sep 30 05:58:42 crc kubenswrapper[4956]: I0930 05:58:42.766378 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc76e3e2-ccde-4622-bcd2-fc347ee16271-kube-api-access-8nrxf" (OuterVolumeSpecName: "kube-api-access-8nrxf") pod "cc76e3e2-ccde-4622-bcd2-fc347ee16271" (UID: "cc76e3e2-ccde-4622-bcd2-fc347ee16271"). InnerVolumeSpecName "kube-api-access-8nrxf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:58:42 crc kubenswrapper[4956]: I0930 05:58:42.789895 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc76e3e2-ccde-4622-bcd2-fc347ee16271-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cc76e3e2-ccde-4622-bcd2-fc347ee16271" (UID: "cc76e3e2-ccde-4622-bcd2-fc347ee16271"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:58:42 crc kubenswrapper[4956]: I0930 05:58:42.801220 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc76e3e2-ccde-4622-bcd2-fc347ee16271-inventory" (OuterVolumeSpecName: "inventory") pod "cc76e3e2-ccde-4622-bcd2-fc347ee16271" (UID: "cc76e3e2-ccde-4622-bcd2-fc347ee16271"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:58:42 crc kubenswrapper[4956]: I0930 05:58:42.862832 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc76e3e2-ccde-4622-bcd2-fc347ee16271-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 05:58:42 crc kubenswrapper[4956]: I0930 05:58:42.862863 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nrxf\" (UniqueName: \"kubernetes.io/projected/cc76e3e2-ccde-4622-bcd2-fc347ee16271-kube-api-access-8nrxf\") on node \"crc\" DevicePath \"\""
Sep 30 05:58:42 crc kubenswrapper[4956]: I0930 05:58:42.862874 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc76e3e2-ccde-4622-bcd2-fc347ee16271-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.241495 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7" event={"ID":"cc76e3e2-ccde-4622-bcd2-fc347ee16271","Type":"ContainerDied","Data":"aaab18566a5751e901c7ba7bd147a76b4213e7a3bb1ee59fef26fd8661f9bd37"}
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.241535 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaab18566a5751e901c7ba7bd147a76b4213e7a3bb1ee59fef26fd8661f9bd37"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.241555 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tnwj7"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.341887 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"]
Sep 30 05:58:43 crc kubenswrapper[4956]: E0930 05:58:43.342592 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc76e3e2-ccde-4622-bcd2-fc347ee16271" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.342622 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc76e3e2-ccde-4622-bcd2-fc347ee16271" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.342947 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc76e3e2-ccde-4622-bcd2-fc347ee16271" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.343901 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.346929 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.348485 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.349048 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.349205 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.359948 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"]
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.396617 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d\" (UID: \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.396697 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d\" (UID: \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.397293 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5ngg\" (UniqueName: \"kubernetes.io/projected/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-kube-api-access-v5ngg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d\" (UID: \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.499258 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5ngg\" (UniqueName: \"kubernetes.io/projected/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-kube-api-access-v5ngg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d\" (UID: \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.499341 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d\" (UID: \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.499382 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d\" (UID: \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.509631 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d\" (UID: \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.509662 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d\" (UID: \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.522959 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5ngg\" (UniqueName: \"kubernetes.io/projected/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-kube-api-access-v5ngg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d\" (UID: \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:58:43 crc kubenswrapper[4956]: I0930 05:58:43.709353 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:58:44 crc kubenswrapper[4956]: I0930 05:58:44.249077 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"]
Sep 30 05:58:45 crc kubenswrapper[4956]: I0930 05:58:45.259734 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d" event={"ID":"ec470ec9-3b1f-409d-ab54-44bb44daf1fe","Type":"ContainerStarted","Data":"182c4d57cb4ed46953f0495d095d72c29e4479dcec4a66c24f9118a9f29f6b0c"}
Sep 30 05:58:45 crc kubenswrapper[4956]: I0930 05:58:45.260207 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d" event={"ID":"ec470ec9-3b1f-409d-ab54-44bb44daf1fe","Type":"ContainerStarted","Data":"26d4b23769d5faecdf168549a13e9d8c3bf06156325c6c23fed8d1d57950e660"}
Sep 30 05:58:45 crc kubenswrapper[4956]: I0930 05:58:45.279336 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d" podStartSLOduration=1.86956255 podStartE2EDuration="2.279318144s" podCreationTimestamp="2025-09-30 05:58:43 +0000 UTC" firstStartedPulling="2025-09-30 05:58:44.258893075 +0000 UTC m=+1794.586013600" lastFinishedPulling="2025-09-30 05:58:44.668648669 +0000 UTC m=+1794.995769194" observedRunningTime="2025-09-30 05:58:45.27632827 +0000 UTC m=+1795.603448855" watchObservedRunningTime="2025-09-30 05:58:45.279318144 +0000 UTC m=+1795.606438669"
Sep 30 05:59:10 crc kubenswrapper[4956]: I0930 05:59:10.050863 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vvw9m"]
Sep 30 05:59:10 crc kubenswrapper[4956]: I0930 05:59:10.060577 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vvw9m"]
Sep 30 05:59:10 crc kubenswrapper[4956]: I0930 05:59:10.352695 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8" path="/var/lib/kubelet/pods/6a7a0b4a-2f92-4af7-a301-b31e1fbe39a8/volumes"
Sep 30 05:59:13 crc kubenswrapper[4956]: I0930 05:59:13.648587 4956 scope.go:117] "RemoveContainer" containerID="d5cc994d3c74d52bfdb0e5c3cb7fcae96c1530c1f2b43189ac8f95df6cc1dae4"
Sep 30 05:59:13 crc kubenswrapper[4956]: I0930 05:59:13.693334 4956 scope.go:117] "RemoveContainer" containerID="c74d0674d1bfa2eb3362b4e92462a8e0c95593532f441c16280005e4f0ad7e96"
Sep 30 05:59:13 crc kubenswrapper[4956]: I0930 05:59:13.744338 4956 scope.go:117] "RemoveContainer" containerID="5d866abb3a48b8d48a5d8a1713bfc515487fe6f85e8335d457c23d46a78a093a"
Sep 30 05:59:43 crc kubenswrapper[4956]: I0930 05:59:43.824616 4956 generic.go:334] "Generic (PLEG): container finished" podID="ec470ec9-3b1f-409d-ab54-44bb44daf1fe" containerID="182c4d57cb4ed46953f0495d095d72c29e4479dcec4a66c24f9118a9f29f6b0c" exitCode=0
Sep 30 05:59:43 crc kubenswrapper[4956]: I0930 05:59:43.824704 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d" event={"ID":"ec470ec9-3b1f-409d-ab54-44bb44daf1fe","Type":"ContainerDied","Data":"182c4d57cb4ed46953f0495d095d72c29e4479dcec4a66c24f9118a9f29f6b0c"}
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.212746 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.308872 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5ngg\" (UniqueName: \"kubernetes.io/projected/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-kube-api-access-v5ngg\") pod \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\" (UID: \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\") "
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.308931 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-inventory\") pod \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\" (UID: \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\") "
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.309254 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-ssh-key\") pod \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\" (UID: \"ec470ec9-3b1f-409d-ab54-44bb44daf1fe\") "
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.318146 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-kube-api-access-v5ngg" (OuterVolumeSpecName: "kube-api-access-v5ngg") pod "ec470ec9-3b1f-409d-ab54-44bb44daf1fe" (UID: "ec470ec9-3b1f-409d-ab54-44bb44daf1fe"). InnerVolumeSpecName "kube-api-access-v5ngg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.342396 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-inventory" (OuterVolumeSpecName: "inventory") pod "ec470ec9-3b1f-409d-ab54-44bb44daf1fe" (UID: "ec470ec9-3b1f-409d-ab54-44bb44daf1fe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.342557 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ec470ec9-3b1f-409d-ab54-44bb44daf1fe" (UID: "ec470ec9-3b1f-409d-ab54-44bb44daf1fe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.412436 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.412482 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5ngg\" (UniqueName: \"kubernetes.io/projected/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-kube-api-access-v5ngg\") on node \"crc\" DevicePath \"\""
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.412502 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec470ec9-3b1f-409d-ab54-44bb44daf1fe-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.843322 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d" event={"ID":"ec470ec9-3b1f-409d-ab54-44bb44daf1fe","Type":"ContainerDied","Data":"26d4b23769d5faecdf168549a13e9d8c3bf06156325c6c23fed8d1d57950e660"}
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.843365 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26d4b23769d5faecdf168549a13e9d8c3bf06156325c6c23fed8d1d57950e660"
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.843451 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d"
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.932975 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-d8mjv"]
Sep 30 05:59:45 crc kubenswrapper[4956]: E0930 05:59:45.933692 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec470ec9-3b1f-409d-ab54-44bb44daf1fe" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.933717 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec470ec9-3b1f-409d-ab54-44bb44daf1fe" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.933924 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec470ec9-3b1f-409d-ab54-44bb44daf1fe" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.934799 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.937975 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.938191 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.938406 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.938476 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn"
Sep 30 05:59:45 crc kubenswrapper[4956]: I0930 05:59:45.945491 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-d8mjv"]
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.022294 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ad04c2f-1706-4b52-9267-613f86dc0388-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-d8mjv\" (UID: \"9ad04c2f-1706-4b52-9267-613f86dc0388\") " pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.022744 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv8mt\" (UniqueName: \"kubernetes.io/projected/9ad04c2f-1706-4b52-9267-613f86dc0388-kube-api-access-zv8mt\") pod \"ssh-known-hosts-edpm-deployment-d8mjv\" (UID: \"9ad04c2f-1706-4b52-9267-613f86dc0388\") " pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.022804 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ad04c2f-1706-4b52-9267-613f86dc0388-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-d8mjv\" (UID: \"9ad04c2f-1706-4b52-9267-613f86dc0388\") " pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.124524 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ad04c2f-1706-4b52-9267-613f86dc0388-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-d8mjv\" (UID: \"9ad04c2f-1706-4b52-9267-613f86dc0388\") " pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.124685 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv8mt\" (UniqueName: \"kubernetes.io/projected/9ad04c2f-1706-4b52-9267-613f86dc0388-kube-api-access-zv8mt\") pod \"ssh-known-hosts-edpm-deployment-d8mjv\" (UID: \"9ad04c2f-1706-4b52-9267-613f86dc0388\") " pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.124713 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ad04c2f-1706-4b52-9267-613f86dc0388-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-d8mjv\" (UID: \"9ad04c2f-1706-4b52-9267-613f86dc0388\") " pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.133769 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ad04c2f-1706-4b52-9267-613f86dc0388-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-d8mjv\" (UID: \"9ad04c2f-1706-4b52-9267-613f86dc0388\") " pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.133977 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ad04c2f-1706-4b52-9267-613f86dc0388-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-d8mjv\" (UID: \"9ad04c2f-1706-4b52-9267-613f86dc0388\") " pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.150170 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv8mt\" (UniqueName: \"kubernetes.io/projected/9ad04c2f-1706-4b52-9267-613f86dc0388-kube-api-access-zv8mt\") pod \"ssh-known-hosts-edpm-deployment-d8mjv\" (UID: \"9ad04c2f-1706-4b52-9267-613f86dc0388\") " pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.261756 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.798547 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-d8mjv"]
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.811599 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 05:59:46 crc kubenswrapper[4956]: I0930 05:59:46.856689 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv" event={"ID":"9ad04c2f-1706-4b52-9267-613f86dc0388","Type":"ContainerStarted","Data":"d57d45dde86afbbc12a315f4631db5c84f26c7d91d4f6e9dfdc545f026622252"}
Sep 30 05:59:47 crc kubenswrapper[4956]: I0930 05:59:47.867274 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv" event={"ID":"9ad04c2f-1706-4b52-9267-613f86dc0388","Type":"ContainerStarted","Data":"923dba12cfd3fdaf8c22a057260703691419c6edd4289c8c35dea84f1bae71f2"}
Sep 30 05:59:47 crc kubenswrapper[4956]: I0930 05:59:47.888303 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv" podStartSLOduration=2.375986844 podStartE2EDuration="2.888285264s" podCreationTimestamp="2025-09-30 05:59:45 +0000 UTC" firstStartedPulling="2025-09-30 05:59:46.811342143 +0000 UTC m=+1857.138462668" lastFinishedPulling="2025-09-30 05:59:47.323640553 +0000 UTC m=+1857.650761088" observedRunningTime="2025-09-30 05:59:47.882969108 +0000 UTC m=+1858.210089673" watchObservedRunningTime="2025-09-30 05:59:47.888285264 +0000 UTC m=+1858.215405789"
Sep 30 05:59:55 crc kubenswrapper[4956]: I0930 05:59:55.953482 4956 generic.go:334] "Generic (PLEG): container finished" podID="9ad04c2f-1706-4b52-9267-613f86dc0388" containerID="923dba12cfd3fdaf8c22a057260703691419c6edd4289c8c35dea84f1bae71f2" exitCode=0
Sep 30 05:59:55 crc kubenswrapper[4956]: I0930 05:59:55.953548 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv" event={"ID":"9ad04c2f-1706-4b52-9267-613f86dc0388","Type":"ContainerDied","Data":"923dba12cfd3fdaf8c22a057260703691419c6edd4289c8c35dea84f1bae71f2"}
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.421930 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.549042 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ad04c2f-1706-4b52-9267-613f86dc0388-inventory-0\") pod \"9ad04c2f-1706-4b52-9267-613f86dc0388\" (UID: \"9ad04c2f-1706-4b52-9267-613f86dc0388\") "
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.549101 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv8mt\" (UniqueName: \"kubernetes.io/projected/9ad04c2f-1706-4b52-9267-613f86dc0388-kube-api-access-zv8mt\") pod \"9ad04c2f-1706-4b52-9267-613f86dc0388\" (UID: \"9ad04c2f-1706-4b52-9267-613f86dc0388\") "
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.549154 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ad04c2f-1706-4b52-9267-613f86dc0388-ssh-key-openstack-edpm-ipam\") pod \"9ad04c2f-1706-4b52-9267-613f86dc0388\" (UID: \"9ad04c2f-1706-4b52-9267-613f86dc0388\") "
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.555367 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ad04c2f-1706-4b52-9267-613f86dc0388-kube-api-access-zv8mt" (OuterVolumeSpecName: "kube-api-access-zv8mt") pod "9ad04c2f-1706-4b52-9267-613f86dc0388" (UID: "9ad04c2f-1706-4b52-9267-613f86dc0388"). InnerVolumeSpecName "kube-api-access-zv8mt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.576548 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ad04c2f-1706-4b52-9267-613f86dc0388-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9ad04c2f-1706-4b52-9267-613f86dc0388" (UID: "9ad04c2f-1706-4b52-9267-613f86dc0388"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.577485 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ad04c2f-1706-4b52-9267-613f86dc0388-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "9ad04c2f-1706-4b52-9267-613f86dc0388" (UID: "9ad04c2f-1706-4b52-9267-613f86dc0388"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.651219 4956 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9ad04c2f-1706-4b52-9267-613f86dc0388-inventory-0\") on node \"crc\" DevicePath \"\""
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.651255 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv8mt\" (UniqueName: \"kubernetes.io/projected/9ad04c2f-1706-4b52-9267-613f86dc0388-kube-api-access-zv8mt\") on node \"crc\" DevicePath \"\""
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.651272 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ad04c2f-1706-4b52-9267-613f86dc0388-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.993808 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv" event={"ID":"9ad04c2f-1706-4b52-9267-613f86dc0388","Type":"ContainerDied","Data":"d57d45dde86afbbc12a315f4631db5c84f26c7d91d4f6e9dfdc545f026622252"}
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.993851 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d57d45dde86afbbc12a315f4631db5c84f26c7d91d4f6e9dfdc545f026622252"
Sep 30 05:59:57 crc kubenswrapper[4956]: I0930 05:59:57.993919 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d8mjv"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.065673 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"]
Sep 30 05:59:58 crc kubenswrapper[4956]: E0930 05:59:58.066510 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad04c2f-1706-4b52-9267-613f86dc0388" containerName="ssh-known-hosts-edpm-deployment"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.066614 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad04c2f-1706-4b52-9267-613f86dc0388" containerName="ssh-known-hosts-edpm-deployment"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.066984 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ad04c2f-1706-4b52-9267-613f86dc0388" containerName="ssh-known-hosts-edpm-deployment"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.068002 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.069771 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.071059 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.072059 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.072175 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.100043 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"]
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.161049 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjl9b\" (UniqueName: \"kubernetes.io/projected/fd6826cb-1434-4917-a836-ae952394b1ca-kube-api-access-jjl9b\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-785q7\" (UID: \"fd6826cb-1434-4917-a836-ae952394b1ca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.161496 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd6826cb-1434-4917-a836-ae952394b1ca-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-785q7\" (UID: \"fd6826cb-1434-4917-a836-ae952394b1ca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.161570 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd6826cb-1434-4917-a836-ae952394b1ca-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-785q7\" (UID: \"fd6826cb-1434-4917-a836-ae952394b1ca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.263403 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd6826cb-1434-4917-a836-ae952394b1ca-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-785q7\" (UID: \"fd6826cb-1434-4917-a836-ae952394b1ca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.263442 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd6826cb-1434-4917-a836-ae952394b1ca-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-785q7\" (UID: \"fd6826cb-1434-4917-a836-ae952394b1ca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.263534 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjl9b\" (UniqueName: \"kubernetes.io/projected/fd6826cb-1434-4917-a836-ae952394b1ca-kube-api-access-jjl9b\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-785q7\" (UID: \"fd6826cb-1434-4917-a836-ae952394b1ca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.266971 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd6826cb-1434-4917-a836-ae952394b1ca-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-785q7\" (UID: \"fd6826cb-1434-4917-a836-ae952394b1ca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.270840 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd6826cb-1434-4917-a836-ae952394b1ca-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-785q7\" (UID: \"fd6826cb-1434-4917-a836-ae952394b1ca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.282625 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjl9b\" (UniqueName: \"kubernetes.io/projected/fd6826cb-1434-4917-a836-ae952394b1ca-kube-api-access-jjl9b\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-785q7\" (UID: \"fd6826cb-1434-4917-a836-ae952394b1ca\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"
Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.402851 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7" Sep 30 05:59:58 crc kubenswrapper[4956]: I0930 05:59:58.927140 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7"] Sep 30 05:59:59 crc kubenswrapper[4956]: I0930 05:59:59.003945 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7" event={"ID":"fd6826cb-1434-4917-a836-ae952394b1ca","Type":"ContainerStarted","Data":"1a279d22b966e4ed8044303fa52f270cb1d44d7da3a414df61a7cec6a87dd667"} Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.014084 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7" event={"ID":"fd6826cb-1434-4917-a836-ae952394b1ca","Type":"ContainerStarted","Data":"9350ed7748535ea9be9a5026f896847e428f7e3603647de6465ba257b2d0fc61"} Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.044404 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7" podStartSLOduration=1.5831078170000001 podStartE2EDuration="2.044379876s" podCreationTimestamp="2025-09-30 05:59:58 +0000 UTC" firstStartedPulling="2025-09-30 05:59:58.939747486 +0000 UTC m=+1869.266868021" lastFinishedPulling="2025-09-30 05:59:59.401019555 +0000 UTC m=+1869.728140080" observedRunningTime="2025-09-30 06:00:00.030174661 +0000 UTC m=+1870.357295216" watchObservedRunningTime="2025-09-30 06:00:00.044379876 +0000 UTC m=+1870.371500411" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.130352 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8"] Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.131550 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.134843 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.136566 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.145651 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8"] Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.300684 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/719ab0a4-b97f-412a-8b05-52ae64f6d995-config-volume\") pod \"collect-profiles-29320200-rc9c8\" (UID: \"719ab0a4-b97f-412a-8b05-52ae64f6d995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.300753 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/719ab0a4-b97f-412a-8b05-52ae64f6d995-secret-volume\") pod \"collect-profiles-29320200-rc9c8\" (UID: \"719ab0a4-b97f-412a-8b05-52ae64f6d995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.300824 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6m2c\" (UniqueName: \"kubernetes.io/projected/719ab0a4-b97f-412a-8b05-52ae64f6d995-kube-api-access-m6m2c\") pod \"collect-profiles-29320200-rc9c8\" (UID: \"719ab0a4-b97f-412a-8b05-52ae64f6d995\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.402595 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/719ab0a4-b97f-412a-8b05-52ae64f6d995-config-volume\") pod \"collect-profiles-29320200-rc9c8\" (UID: \"719ab0a4-b97f-412a-8b05-52ae64f6d995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.402684 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/719ab0a4-b97f-412a-8b05-52ae64f6d995-secret-volume\") pod \"collect-profiles-29320200-rc9c8\" (UID: \"719ab0a4-b97f-412a-8b05-52ae64f6d995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.402769 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6m2c\" (UniqueName: \"kubernetes.io/projected/719ab0a4-b97f-412a-8b05-52ae64f6d995-kube-api-access-m6m2c\") pod \"collect-profiles-29320200-rc9c8\" (UID: \"719ab0a4-b97f-412a-8b05-52ae64f6d995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.403718 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/719ab0a4-b97f-412a-8b05-52ae64f6d995-config-volume\") pod \"collect-profiles-29320200-rc9c8\" (UID: \"719ab0a4-b97f-412a-8b05-52ae64f6d995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.418240 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6m2c\" (UniqueName: 
\"kubernetes.io/projected/719ab0a4-b97f-412a-8b05-52ae64f6d995-kube-api-access-m6m2c\") pod \"collect-profiles-29320200-rc9c8\" (UID: \"719ab0a4-b97f-412a-8b05-52ae64f6d995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.426749 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/719ab0a4-b97f-412a-8b05-52ae64f6d995-secret-volume\") pod \"collect-profiles-29320200-rc9c8\" (UID: \"719ab0a4-b97f-412a-8b05-52ae64f6d995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.453148 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:00 crc kubenswrapper[4956]: I0930 06:00:00.914223 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8"] Sep 30 06:00:00 crc kubenswrapper[4956]: W0930 06:00:00.921263 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod719ab0a4_b97f_412a_8b05_52ae64f6d995.slice/crio-d4d6eaaa9636d3fdd185419d5486f3f52d4067394ce9be54af6acfecdfb18c1a WatchSource:0}: Error finding container d4d6eaaa9636d3fdd185419d5486f3f52d4067394ce9be54af6acfecdfb18c1a: Status 404 returned error can't find the container with id d4d6eaaa9636d3fdd185419d5486f3f52d4067394ce9be54af6acfecdfb18c1a Sep 30 06:00:01 crc kubenswrapper[4956]: I0930 06:00:01.022776 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" event={"ID":"719ab0a4-b97f-412a-8b05-52ae64f6d995","Type":"ContainerStarted","Data":"d4d6eaaa9636d3fdd185419d5486f3f52d4067394ce9be54af6acfecdfb18c1a"} Sep 30 06:00:02 crc kubenswrapper[4956]: 
I0930 06:00:02.033577 4956 generic.go:334] "Generic (PLEG): container finished" podID="719ab0a4-b97f-412a-8b05-52ae64f6d995" containerID="a947d57ffef82fb533888f469c314643f902dcaf76d7a1676b840d17cbab3197" exitCode=0 Sep 30 06:00:02 crc kubenswrapper[4956]: I0930 06:00:02.033679 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" event={"ID":"719ab0a4-b97f-412a-8b05-52ae64f6d995","Type":"ContainerDied","Data":"a947d57ffef82fb533888f469c314643f902dcaf76d7a1676b840d17cbab3197"} Sep 30 06:00:03 crc kubenswrapper[4956]: I0930 06:00:03.378229 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:03 crc kubenswrapper[4956]: I0930 06:00:03.476376 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6m2c\" (UniqueName: \"kubernetes.io/projected/719ab0a4-b97f-412a-8b05-52ae64f6d995-kube-api-access-m6m2c\") pod \"719ab0a4-b97f-412a-8b05-52ae64f6d995\" (UID: \"719ab0a4-b97f-412a-8b05-52ae64f6d995\") " Sep 30 06:00:03 crc kubenswrapper[4956]: I0930 06:00:03.476440 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/719ab0a4-b97f-412a-8b05-52ae64f6d995-secret-volume\") pod \"719ab0a4-b97f-412a-8b05-52ae64f6d995\" (UID: \"719ab0a4-b97f-412a-8b05-52ae64f6d995\") " Sep 30 06:00:03 crc kubenswrapper[4956]: I0930 06:00:03.476724 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/719ab0a4-b97f-412a-8b05-52ae64f6d995-config-volume\") pod \"719ab0a4-b97f-412a-8b05-52ae64f6d995\" (UID: \"719ab0a4-b97f-412a-8b05-52ae64f6d995\") " Sep 30 06:00:03 crc kubenswrapper[4956]: I0930 06:00:03.477319 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/719ab0a4-b97f-412a-8b05-52ae64f6d995-config-volume" (OuterVolumeSpecName: "config-volume") pod "719ab0a4-b97f-412a-8b05-52ae64f6d995" (UID: "719ab0a4-b97f-412a-8b05-52ae64f6d995"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:00:03 crc kubenswrapper[4956]: I0930 06:00:03.482308 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719ab0a4-b97f-412a-8b05-52ae64f6d995-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "719ab0a4-b97f-412a-8b05-52ae64f6d995" (UID: "719ab0a4-b97f-412a-8b05-52ae64f6d995"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:00:03 crc kubenswrapper[4956]: I0930 06:00:03.482581 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719ab0a4-b97f-412a-8b05-52ae64f6d995-kube-api-access-m6m2c" (OuterVolumeSpecName: "kube-api-access-m6m2c") pod "719ab0a4-b97f-412a-8b05-52ae64f6d995" (UID: "719ab0a4-b97f-412a-8b05-52ae64f6d995"). InnerVolumeSpecName "kube-api-access-m6m2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:00:03 crc kubenswrapper[4956]: I0930 06:00:03.578266 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6m2c\" (UniqueName: \"kubernetes.io/projected/719ab0a4-b97f-412a-8b05-52ae64f6d995-kube-api-access-m6m2c\") on node \"crc\" DevicePath \"\"" Sep 30 06:00:03 crc kubenswrapper[4956]: I0930 06:00:03.578296 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/719ab0a4-b97f-412a-8b05-52ae64f6d995-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:00:03 crc kubenswrapper[4956]: I0930 06:00:03.578305 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/719ab0a4-b97f-412a-8b05-52ae64f6d995-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:00:04 crc kubenswrapper[4956]: I0930 06:00:04.057236 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" event={"ID":"719ab0a4-b97f-412a-8b05-52ae64f6d995","Type":"ContainerDied","Data":"d4d6eaaa9636d3fdd185419d5486f3f52d4067394ce9be54af6acfecdfb18c1a"} Sep 30 06:00:04 crc kubenswrapper[4956]: I0930 06:00:04.057492 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4d6eaaa9636d3fdd185419d5486f3f52d4067394ce9be54af6acfecdfb18c1a" Sep 30 06:00:04 crc kubenswrapper[4956]: I0930 06:00:04.057307 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8" Sep 30 06:00:09 crc kubenswrapper[4956]: I0930 06:00:09.104646 4956 generic.go:334] "Generic (PLEG): container finished" podID="fd6826cb-1434-4917-a836-ae952394b1ca" containerID="9350ed7748535ea9be9a5026f896847e428f7e3603647de6465ba257b2d0fc61" exitCode=0 Sep 30 06:00:09 crc kubenswrapper[4956]: I0930 06:00:09.104918 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7" event={"ID":"fd6826cb-1434-4917-a836-ae952394b1ca","Type":"ContainerDied","Data":"9350ed7748535ea9be9a5026f896847e428f7e3603647de6465ba257b2d0fc61"} Sep 30 06:00:10 crc kubenswrapper[4956]: I0930 06:00:10.574279 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7" Sep 30 06:00:10 crc kubenswrapper[4956]: I0930 06:00:10.733266 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd6826cb-1434-4917-a836-ae952394b1ca-ssh-key\") pod \"fd6826cb-1434-4917-a836-ae952394b1ca\" (UID: \"fd6826cb-1434-4917-a836-ae952394b1ca\") " Sep 30 06:00:10 crc kubenswrapper[4956]: I0930 06:00:10.733583 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd6826cb-1434-4917-a836-ae952394b1ca-inventory\") pod \"fd6826cb-1434-4917-a836-ae952394b1ca\" (UID: \"fd6826cb-1434-4917-a836-ae952394b1ca\") " Sep 30 06:00:10 crc kubenswrapper[4956]: I0930 06:00:10.733634 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjl9b\" (UniqueName: \"kubernetes.io/projected/fd6826cb-1434-4917-a836-ae952394b1ca-kube-api-access-jjl9b\") pod \"fd6826cb-1434-4917-a836-ae952394b1ca\" (UID: \"fd6826cb-1434-4917-a836-ae952394b1ca\") " Sep 30 06:00:10 crc kubenswrapper[4956]: I0930 
06:00:10.742726 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd6826cb-1434-4917-a836-ae952394b1ca-kube-api-access-jjl9b" (OuterVolumeSpecName: "kube-api-access-jjl9b") pod "fd6826cb-1434-4917-a836-ae952394b1ca" (UID: "fd6826cb-1434-4917-a836-ae952394b1ca"). InnerVolumeSpecName "kube-api-access-jjl9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:00:10 crc kubenswrapper[4956]: I0930 06:00:10.762557 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd6826cb-1434-4917-a836-ae952394b1ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fd6826cb-1434-4917-a836-ae952394b1ca" (UID: "fd6826cb-1434-4917-a836-ae952394b1ca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:00:10 crc kubenswrapper[4956]: I0930 06:00:10.764592 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd6826cb-1434-4917-a836-ae952394b1ca-inventory" (OuterVolumeSpecName: "inventory") pod "fd6826cb-1434-4917-a836-ae952394b1ca" (UID: "fd6826cb-1434-4917-a836-ae952394b1ca"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:00:10 crc kubenswrapper[4956]: I0930 06:00:10.836226 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd6826cb-1434-4917-a836-ae952394b1ca-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:00:10 crc kubenswrapper[4956]: I0930 06:00:10.836263 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd6826cb-1434-4917-a836-ae952394b1ca-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:00:10 crc kubenswrapper[4956]: I0930 06:00:10.836274 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjl9b\" (UniqueName: \"kubernetes.io/projected/fd6826cb-1434-4917-a836-ae952394b1ca-kube-api-access-jjl9b\") on node \"crc\" DevicePath \"\"" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.123353 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7" event={"ID":"fd6826cb-1434-4917-a836-ae952394b1ca","Type":"ContainerDied","Data":"1a279d22b966e4ed8044303fa52f270cb1d44d7da3a414df61a7cec6a87dd667"} Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.123702 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a279d22b966e4ed8044303fa52f270cb1d44d7da3a414df61a7cec6a87dd667" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.123438 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-785q7" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.198439 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk"] Sep 30 06:00:11 crc kubenswrapper[4956]: E0930 06:00:11.198945 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719ab0a4-b97f-412a-8b05-52ae64f6d995" containerName="collect-profiles" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.198967 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="719ab0a4-b97f-412a-8b05-52ae64f6d995" containerName="collect-profiles" Sep 30 06:00:11 crc kubenswrapper[4956]: E0930 06:00:11.198986 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6826cb-1434-4917-a836-ae952394b1ca" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.198996 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6826cb-1434-4917-a836-ae952394b1ca" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.199247 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6826cb-1434-4917-a836-ae952394b1ca" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.199283 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="719ab0a4-b97f-412a-8b05-52ae64f6d995" containerName="collect-profiles" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.199950 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.206894 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.206899 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.207626 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.207844 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.213535 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk"] Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.347256 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk\" (UID: \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.347319 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99c52\" (UniqueName: \"kubernetes.io/projected/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-kube-api-access-99c52\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk\" (UID: \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.347382 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk\" (UID: \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.448737 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99c52\" (UniqueName: \"kubernetes.io/projected/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-kube-api-access-99c52\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk\" (UID: \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.448838 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk\" (UID: \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.449034 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk\" (UID: \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.453343 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk\" (UID: \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.455043 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk\" (UID: \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.463579 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99c52\" (UniqueName: \"kubernetes.io/projected/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-kube-api-access-99c52\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk\" (UID: \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:11 crc kubenswrapper[4956]: I0930 06:00:11.527002 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:12 crc kubenswrapper[4956]: I0930 06:00:12.065670 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk"] Sep 30 06:00:12 crc kubenswrapper[4956]: I0930 06:00:12.133063 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" event={"ID":"17e89a19-be9f-454b-b1c9-8f5b813a9a3b","Type":"ContainerStarted","Data":"8b9b2829ca85bb2ce337ba65dedcb1bdbfcbd0331befa8a6fccb8f09b4cf5e3f"} Sep 30 06:00:13 crc kubenswrapper[4956]: I0930 06:00:13.143587 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" event={"ID":"17e89a19-be9f-454b-b1c9-8f5b813a9a3b","Type":"ContainerStarted","Data":"01139bc3992e0c2f5c4c5ea0180409dca961bba8afe3a1ee8a3f9a7ee91308d9"} Sep 30 06:00:13 crc kubenswrapper[4956]: I0930 06:00:13.176191 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" podStartSLOduration=1.707475858 podStartE2EDuration="2.176170361s" podCreationTimestamp="2025-09-30 06:00:11 +0000 UTC" firstStartedPulling="2025-09-30 06:00:12.07844001 +0000 UTC m=+1882.405560575" lastFinishedPulling="2025-09-30 06:00:12.547134533 +0000 UTC m=+1882.874255078" observedRunningTime="2025-09-30 06:00:13.161960795 +0000 UTC m=+1883.489081330" watchObservedRunningTime="2025-09-30 06:00:13.176170361 +0000 UTC m=+1883.503290896" Sep 30 06:00:23 crc kubenswrapper[4956]: I0930 06:00:23.246486 4956 generic.go:334] "Generic (PLEG): container finished" podID="17e89a19-be9f-454b-b1c9-8f5b813a9a3b" containerID="01139bc3992e0c2f5c4c5ea0180409dca961bba8afe3a1ee8a3f9a7ee91308d9" exitCode=0 Sep 30 06:00:23 crc kubenswrapper[4956]: I0930 06:00:23.246622 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" event={"ID":"17e89a19-be9f-454b-b1c9-8f5b813a9a3b","Type":"ContainerDied","Data":"01139bc3992e0c2f5c4c5ea0180409dca961bba8afe3a1ee8a3f9a7ee91308d9"} Sep 30 06:00:24 crc kubenswrapper[4956]: I0930 06:00:24.617727 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:24 crc kubenswrapper[4956]: I0930 06:00:24.726466 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99c52\" (UniqueName: \"kubernetes.io/projected/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-kube-api-access-99c52\") pod \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\" (UID: \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\") " Sep 30 06:00:24 crc kubenswrapper[4956]: I0930 06:00:24.726577 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-ssh-key\") pod \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\" (UID: \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\") " Sep 30 06:00:24 crc kubenswrapper[4956]: I0930 06:00:24.726787 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-inventory\") pod \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\" (UID: \"17e89a19-be9f-454b-b1c9-8f5b813a9a3b\") " Sep 30 06:00:24 crc kubenswrapper[4956]: I0930 06:00:24.732798 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-kube-api-access-99c52" (OuterVolumeSpecName: "kube-api-access-99c52") pod "17e89a19-be9f-454b-b1c9-8f5b813a9a3b" (UID: "17e89a19-be9f-454b-b1c9-8f5b813a9a3b"). InnerVolumeSpecName "kube-api-access-99c52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:00:24 crc kubenswrapper[4956]: I0930 06:00:24.761917 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-inventory" (OuterVolumeSpecName: "inventory") pod "17e89a19-be9f-454b-b1c9-8f5b813a9a3b" (UID: "17e89a19-be9f-454b-b1c9-8f5b813a9a3b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:00:24 crc kubenswrapper[4956]: I0930 06:00:24.769733 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17e89a19-be9f-454b-b1c9-8f5b813a9a3b" (UID: "17e89a19-be9f-454b-b1c9-8f5b813a9a3b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:00:24 crc kubenswrapper[4956]: I0930 06:00:24.828908 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:00:24 crc kubenswrapper[4956]: I0930 06:00:24.828939 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99c52\" (UniqueName: \"kubernetes.io/projected/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-kube-api-access-99c52\") on node \"crc\" DevicePath \"\"" Sep 30 06:00:24 crc kubenswrapper[4956]: I0930 06:00:24.828948 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17e89a19-be9f-454b-b1c9-8f5b813a9a3b-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.267503 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" 
event={"ID":"17e89a19-be9f-454b-b1c9-8f5b813a9a3b","Type":"ContainerDied","Data":"8b9b2829ca85bb2ce337ba65dedcb1bdbfcbd0331befa8a6fccb8f09b4cf5e3f"} Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.267568 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b9b2829ca85bb2ce337ba65dedcb1bdbfcbd0331befa8a6fccb8f09b4cf5e3f" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.267645 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.348394 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp"] Sep 30 06:00:25 crc kubenswrapper[4956]: E0930 06:00:25.348745 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e89a19-be9f-454b-b1c9-8f5b813a9a3b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.348761 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e89a19-be9f-454b-b1c9-8f5b813a9a3b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.348985 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e89a19-be9f-454b-b1c9-8f5b813a9a3b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.349639 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.354230 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.354280 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.354421 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.356031 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.356071 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.356677 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.356830 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.357028 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.368558 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp"] Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.443445 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.443493 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.443598 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.443628 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.443663 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.443706 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.443736 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.443786 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.443830 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.443906 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.443940 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n29hq\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-kube-api-access-n29hq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.443987 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.444052 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.444074 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.545557 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.545991 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.546054 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.546102 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.546195 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.546231 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n29hq\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-kube-api-access-n29hq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.546280 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: 
\"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.546350 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.546386 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.546445 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.546478 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: 
I0930 06:00:25.546557 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.546599 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.546648 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.553022 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.560458 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.560665 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.560762 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.561003 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.561701 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.562794 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.564401 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.564485 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.564509 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.564519 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.565060 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.565151 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc kubenswrapper[4956]: I0930 06:00:25.568059 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n29hq\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-kube-api-access-n29hq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:25 crc 
kubenswrapper[4956]: I0930 06:00:25.713299 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:00:26 crc kubenswrapper[4956]: I0930 06:00:26.258793 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp"] Sep 30 06:00:26 crc kubenswrapper[4956]: I0930 06:00:26.277104 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" event={"ID":"278c7daa-7016-4bde-8424-bd0c3491cb3c","Type":"ContainerStarted","Data":"d011ea283a46e8bf07fa3f35c50d072a88a4bf00ac1cf10801b74aec56d2734f"} Sep 30 06:00:27 crc kubenswrapper[4956]: I0930 06:00:27.288761 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" event={"ID":"278c7daa-7016-4bde-8424-bd0c3491cb3c","Type":"ContainerStarted","Data":"bcc1cd87b770ac9e04381d618a0b832e7be324bfdf2c26391df314efcc1c2ccb"} Sep 30 06:00:27 crc kubenswrapper[4956]: I0930 06:00:27.323336 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" podStartSLOduration=1.857694316 podStartE2EDuration="2.323314912s" podCreationTimestamp="2025-09-30 06:00:25 +0000 UTC" firstStartedPulling="2025-09-30 06:00:26.264358957 +0000 UTC m=+1896.591479482" lastFinishedPulling="2025-09-30 06:00:26.729979553 +0000 UTC m=+1897.057100078" observedRunningTime="2025-09-30 06:00:27.314429804 +0000 UTC m=+1897.641550329" watchObservedRunningTime="2025-09-30 06:00:27.323314912 +0000 UTC m=+1897.650435447" Sep 30 06:00:48 crc kubenswrapper[4956]: I0930 06:00:48.073667 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:00:48 crc kubenswrapper[4956]: I0930 06:00:48.074265 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.150640 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320201-dhqp5"] Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.152515 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.158601 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320201-dhqp5"] Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.299421 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-config-data\") pod \"keystone-cron-29320201-dhqp5\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.299558 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-combined-ca-bundle\") pod \"keystone-cron-29320201-dhqp5\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.299724 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-fernet-keys\") pod \"keystone-cron-29320201-dhqp5\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.299755 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75tj6\" (UniqueName: \"kubernetes.io/projected/356a925e-f5c8-48b3-b62c-5c80f7566d01-kube-api-access-75tj6\") pod \"keystone-cron-29320201-dhqp5\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.401191 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-combined-ca-bundle\") pod \"keystone-cron-29320201-dhqp5\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.401288 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-fernet-keys\") pod \"keystone-cron-29320201-dhqp5\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.401309 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75tj6\" (UniqueName: \"kubernetes.io/projected/356a925e-f5c8-48b3-b62c-5c80f7566d01-kube-api-access-75tj6\") pod \"keystone-cron-29320201-dhqp5\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.401371 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-config-data\") pod \"keystone-cron-29320201-dhqp5\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.407246 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-combined-ca-bundle\") pod \"keystone-cron-29320201-dhqp5\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.413448 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-config-data\") pod \"keystone-cron-29320201-dhqp5\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.416518 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-fernet-keys\") pod \"keystone-cron-29320201-dhqp5\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.424701 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75tj6\" (UniqueName: \"kubernetes.io/projected/356a925e-f5c8-48b3-b62c-5c80f7566d01-kube-api-access-75tj6\") pod \"keystone-cron-29320201-dhqp5\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.490862 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:00 crc kubenswrapper[4956]: I0930 06:01:00.905499 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320201-dhqp5"] Sep 30 06:01:01 crc kubenswrapper[4956]: I0930 06:01:01.642145 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320201-dhqp5" event={"ID":"356a925e-f5c8-48b3-b62c-5c80f7566d01","Type":"ContainerStarted","Data":"97516f0d2e2afa9f4bc0f62bf9fa5fea6a6f2bdd9d504d9b619af849272a91da"} Sep 30 06:01:01 crc kubenswrapper[4956]: I0930 06:01:01.642663 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320201-dhqp5" event={"ID":"356a925e-f5c8-48b3-b62c-5c80f7566d01","Type":"ContainerStarted","Data":"d6730908a705bac2c69b0c29a2b02ad324483f4bc5a4afc077112eb557412008"} Sep 30 06:01:01 crc kubenswrapper[4956]: I0930 06:01:01.666006 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320201-dhqp5" podStartSLOduration=1.66599104 podStartE2EDuration="1.66599104s" podCreationTimestamp="2025-09-30 06:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:01:01.65708119 +0000 UTC m=+1931.984201735" watchObservedRunningTime="2025-09-30 06:01:01.66599104 +0000 UTC m=+1931.993111565" Sep 30 06:01:03 crc kubenswrapper[4956]: I0930 06:01:03.663684 4956 generic.go:334] "Generic (PLEG): container finished" podID="356a925e-f5c8-48b3-b62c-5c80f7566d01" containerID="97516f0d2e2afa9f4bc0f62bf9fa5fea6a6f2bdd9d504d9b619af849272a91da" exitCode=0 Sep 30 06:01:03 crc kubenswrapper[4956]: I0930 06:01:03.663832 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320201-dhqp5" event={"ID":"356a925e-f5c8-48b3-b62c-5c80f7566d01","Type":"ContainerDied","Data":"97516f0d2e2afa9f4bc0f62bf9fa5fea6a6f2bdd9d504d9b619af849272a91da"} 
Sep 30 06:01:04 crc kubenswrapper[4956]: I0930 06:01:04.983484 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.094345 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75tj6\" (UniqueName: \"kubernetes.io/projected/356a925e-f5c8-48b3-b62c-5c80f7566d01-kube-api-access-75tj6\") pod \"356a925e-f5c8-48b3-b62c-5c80f7566d01\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.094431 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-fernet-keys\") pod \"356a925e-f5c8-48b3-b62c-5c80f7566d01\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.094545 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-config-data\") pod \"356a925e-f5c8-48b3-b62c-5c80f7566d01\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.094881 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-combined-ca-bundle\") pod \"356a925e-f5c8-48b3-b62c-5c80f7566d01\" (UID: \"356a925e-f5c8-48b3-b62c-5c80f7566d01\") " Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.112882 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "356a925e-f5c8-48b3-b62c-5c80f7566d01" (UID: "356a925e-f5c8-48b3-b62c-5c80f7566d01"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.115065 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/356a925e-f5c8-48b3-b62c-5c80f7566d01-kube-api-access-75tj6" (OuterVolumeSpecName: "kube-api-access-75tj6") pod "356a925e-f5c8-48b3-b62c-5c80f7566d01" (UID: "356a925e-f5c8-48b3-b62c-5c80f7566d01"). InnerVolumeSpecName "kube-api-access-75tj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.122790 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "356a925e-f5c8-48b3-b62c-5c80f7566d01" (UID: "356a925e-f5c8-48b3-b62c-5c80f7566d01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.148413 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-config-data" (OuterVolumeSpecName: "config-data") pod "356a925e-f5c8-48b3-b62c-5c80f7566d01" (UID: "356a925e-f5c8-48b3-b62c-5c80f7566d01"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.197015 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.197267 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.197383 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75tj6\" (UniqueName: \"kubernetes.io/projected/356a925e-f5c8-48b3-b62c-5c80f7566d01-kube-api-access-75tj6\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.197471 4956 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/356a925e-f5c8-48b3-b62c-5c80f7566d01-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.690023 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320201-dhqp5" Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.689916 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320201-dhqp5" event={"ID":"356a925e-f5c8-48b3-b62c-5c80f7566d01","Type":"ContainerDied","Data":"d6730908a705bac2c69b0c29a2b02ad324483f4bc5a4afc077112eb557412008"} Sep 30 06:01:05 crc kubenswrapper[4956]: I0930 06:01:05.691352 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6730908a705bac2c69b0c29a2b02ad324483f4bc5a4afc077112eb557412008" Sep 30 06:01:10 crc kubenswrapper[4956]: I0930 06:01:10.735986 4956 generic.go:334] "Generic (PLEG): container finished" podID="278c7daa-7016-4bde-8424-bd0c3491cb3c" containerID="bcc1cd87b770ac9e04381d618a0b832e7be324bfdf2c26391df314efcc1c2ccb" exitCode=0 Sep 30 06:01:10 crc kubenswrapper[4956]: I0930 06:01:10.736100 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" event={"ID":"278c7daa-7016-4bde-8424-bd0c3491cb3c","Type":"ContainerDied","Data":"bcc1cd87b770ac9e04381d618a0b832e7be324bfdf2c26391df314efcc1c2ccb"} Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.157971 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.232709 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-neutron-metadata-combined-ca-bundle\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.232971 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ovn-combined-ca-bundle\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.232992 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.233028 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-libvirt-combined-ca-bundle\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.233061 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n29hq\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-kube-api-access-n29hq\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: 
\"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.233092 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.233165 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-bootstrap-combined-ca-bundle\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.233187 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-repo-setup-combined-ca-bundle\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.233237 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-nova-combined-ca-bundle\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.233262 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 
06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.233328 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-inventory\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.233353 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ssh-key\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.233434 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-telemetry-combined-ca-bundle\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.233455 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.239595 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.240404 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.240517 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.243572 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.243606 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.243638 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.244151 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.244530 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.244580 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-kube-api-access-n29hq" (OuterVolumeSpecName: "kube-api-access-n29hq") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). 
InnerVolumeSpecName "kube-api-access-n29hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.245192 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.245257 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.252393 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: E0930 06:01:12.263497 4956 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ssh-key podName:278c7daa-7016-4bde-8424-bd0c3491cb3c nodeName:}" failed. No retries permitted until 2025-09-30 06:01:12.763467586 +0000 UTC m=+1943.090588131 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ssh-key") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c") : error deleting /var/lib/kubelet/pods/278c7daa-7016-4bde-8424-bd0c3491cb3c/volume-subpaths: remove /var/lib/kubelet/pods/278c7daa-7016-4bde-8424-bd0c3491cb3c/volume-subpaths: no such file or directory Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.265633 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-inventory" (OuterVolumeSpecName: "inventory") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336091 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n29hq\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-kube-api-access-n29hq\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336157 4956 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336174 4956 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336187 4956 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336199 4956 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336211 4956 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336224 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336236 4956 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336248 4956 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336259 4956 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336271 4956 
reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336282 4956 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/278c7daa-7016-4bde-8424-bd0c3491cb3c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.336295 4956 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.754785 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" event={"ID":"278c7daa-7016-4bde-8424-bd0c3491cb3c","Type":"ContainerDied","Data":"d011ea283a46e8bf07fa3f35c50d072a88a4bf00ac1cf10801b74aec56d2734f"} Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.754830 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d011ea283a46e8bf07fa3f35c50d072a88a4bf00ac1cf10801b74aec56d2734f" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.755179 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.845397 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ssh-key\") pod \"278c7daa-7016-4bde-8424-bd0c3491cb3c\" (UID: \"278c7daa-7016-4bde-8424-bd0c3491cb3c\") " Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.851483 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "278c7daa-7016-4bde-8424-bd0c3491cb3c" (UID: "278c7daa-7016-4bde-8424-bd0c3491cb3c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.925513 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p"] Sep 30 06:01:12 crc kubenswrapper[4956]: E0930 06:01:12.925995 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356a925e-f5c8-48b3-b62c-5c80f7566d01" containerName="keystone-cron" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.926016 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="356a925e-f5c8-48b3-b62c-5c80f7566d01" containerName="keystone-cron" Sep 30 06:01:12 crc kubenswrapper[4956]: E0930 06:01:12.926067 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278c7daa-7016-4bde-8424-bd0c3491cb3c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.926078 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="278c7daa-7016-4bde-8424-bd0c3491cb3c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.926345 4956 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="356a925e-f5c8-48b3-b62c-5c80f7566d01" containerName="keystone-cron" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.926380 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="278c7daa-7016-4bde-8424-bd0c3491cb3c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.927040 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.930473 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.947556 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p"] Sep 30 06:01:12 crc kubenswrapper[4956]: I0930 06:01:12.948158 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/278c7daa-7016-4bde-8424-bd0c3491cb3c-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.049711 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.050059 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.050104 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x2bp\" (UniqueName: \"kubernetes.io/projected/a6c38cbf-b86e-473b-8f78-b917dc31d239-kube-api-access-4x2bp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.050155 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.050419 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a6c38cbf-b86e-473b-8f78-b917dc31d239-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.151992 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.152045 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x2bp\" (UniqueName: 
\"kubernetes.io/projected/a6c38cbf-b86e-473b-8f78-b917dc31d239-kube-api-access-4x2bp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.152069 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.152141 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a6c38cbf-b86e-473b-8f78-b917dc31d239-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.152207 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.152963 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a6c38cbf-b86e-473b-8f78-b917dc31d239-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc 
kubenswrapper[4956]: I0930 06:01:13.155741 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.156190 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.156849 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.174184 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x2bp\" (UniqueName: \"kubernetes.io/projected/a6c38cbf-b86e-473b-8f78-b917dc31d239-kube-api-access-4x2bp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fb24p\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.255746 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:01:13 crc kubenswrapper[4956]: I0930 06:01:13.852394 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p"] Sep 30 06:01:14 crc kubenswrapper[4956]: I0930 06:01:14.774538 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" event={"ID":"a6c38cbf-b86e-473b-8f78-b917dc31d239","Type":"ContainerStarted","Data":"c88bf046548796a2a73d1c43baa2a64c84a4d7dc1a8044aa0f0fad2471d112ac"} Sep 30 06:01:14 crc kubenswrapper[4956]: I0930 06:01:14.775412 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" event={"ID":"a6c38cbf-b86e-473b-8f78-b917dc31d239","Type":"ContainerStarted","Data":"04fdb7c0d93fc1a6bfdfbd4406047ae5398fe87e2199e25ed623ab2f7dd12ddd"} Sep 30 06:01:14 crc kubenswrapper[4956]: I0930 06:01:14.792427 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" podStartSLOduration=2.340899778 podStartE2EDuration="2.792407052s" podCreationTimestamp="2025-09-30 06:01:12 +0000 UTC" firstStartedPulling="2025-09-30 06:01:13.85732648 +0000 UTC m=+1944.184447005" lastFinishedPulling="2025-09-30 06:01:14.308833754 +0000 UTC m=+1944.635954279" observedRunningTime="2025-09-30 06:01:14.787163759 +0000 UTC m=+1945.114284304" watchObservedRunningTime="2025-09-30 06:01:14.792407052 +0000 UTC m=+1945.119527577" Sep 30 06:01:18 crc kubenswrapper[4956]: I0930 06:01:18.073593 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:01:18 crc kubenswrapper[4956]: I0930 06:01:18.074713 4956 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:01:48 crc kubenswrapper[4956]: I0930 06:01:48.073922 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:01:48 crc kubenswrapper[4956]: I0930 06:01:48.074612 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:01:48 crc kubenswrapper[4956]: I0930 06:01:48.074678 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 06:01:48 crc kubenswrapper[4956]: I0930 06:01:48.075721 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0dfc54b32590324bdbf7360f23fe87e3bb3d8dad1586a2b8c737285b6be9a13a"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:01:48 crc kubenswrapper[4956]: I0930 06:01:48.075810 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" 
containerName="machine-config-daemon" containerID="cri-o://0dfc54b32590324bdbf7360f23fe87e3bb3d8dad1586a2b8c737285b6be9a13a" gracePeriod=600 Sep 30 06:01:49 crc kubenswrapper[4956]: I0930 06:01:49.149716 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="0dfc54b32590324bdbf7360f23fe87e3bb3d8dad1586a2b8c737285b6be9a13a" exitCode=0 Sep 30 06:01:49 crc kubenswrapper[4956]: I0930 06:01:49.149794 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"0dfc54b32590324bdbf7360f23fe87e3bb3d8dad1586a2b8c737285b6be9a13a"} Sep 30 06:01:49 crc kubenswrapper[4956]: I0930 06:01:49.150273 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894"} Sep 30 06:01:49 crc kubenswrapper[4956]: I0930 06:01:49.150306 4956 scope.go:117] "RemoveContainer" containerID="e4a6a76fd353a301f6e07c1be4c7acb168261a9686715385c4d981d26714303b" Sep 30 06:02:30 crc kubenswrapper[4956]: I0930 06:02:30.561368 4956 generic.go:334] "Generic (PLEG): container finished" podID="a6c38cbf-b86e-473b-8f78-b917dc31d239" containerID="c88bf046548796a2a73d1c43baa2a64c84a4d7dc1a8044aa0f0fad2471d112ac" exitCode=0 Sep 30 06:02:30 crc kubenswrapper[4956]: I0930 06:02:30.561502 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" event={"ID":"a6c38cbf-b86e-473b-8f78-b917dc31d239","Type":"ContainerDied","Data":"c88bf046548796a2a73d1c43baa2a64c84a4d7dc1a8044aa0f0fad2471d112ac"} Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.045074 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.173133 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-ovn-combined-ca-bundle\") pod \"a6c38cbf-b86e-473b-8f78-b917dc31d239\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.173415 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-ssh-key\") pod \"a6c38cbf-b86e-473b-8f78-b917dc31d239\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.173524 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-inventory\") pod \"a6c38cbf-b86e-473b-8f78-b917dc31d239\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.173572 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x2bp\" (UniqueName: \"kubernetes.io/projected/a6c38cbf-b86e-473b-8f78-b917dc31d239-kube-api-access-4x2bp\") pod \"a6c38cbf-b86e-473b-8f78-b917dc31d239\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.173689 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a6c38cbf-b86e-473b-8f78-b917dc31d239-ovncontroller-config-0\") pod \"a6c38cbf-b86e-473b-8f78-b917dc31d239\" (UID: \"a6c38cbf-b86e-473b-8f78-b917dc31d239\") " Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.179141 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/a6c38cbf-b86e-473b-8f78-b917dc31d239-kube-api-access-4x2bp" (OuterVolumeSpecName: "kube-api-access-4x2bp") pod "a6c38cbf-b86e-473b-8f78-b917dc31d239" (UID: "a6c38cbf-b86e-473b-8f78-b917dc31d239"). InnerVolumeSpecName "kube-api-access-4x2bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.179241 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a6c38cbf-b86e-473b-8f78-b917dc31d239" (UID: "a6c38cbf-b86e-473b-8f78-b917dc31d239"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.200698 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a6c38cbf-b86e-473b-8f78-b917dc31d239" (UID: "a6c38cbf-b86e-473b-8f78-b917dc31d239"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.202246 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6c38cbf-b86e-473b-8f78-b917dc31d239-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a6c38cbf-b86e-473b-8f78-b917dc31d239" (UID: "a6c38cbf-b86e-473b-8f78-b917dc31d239"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.206321 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-inventory" (OuterVolumeSpecName: "inventory") pod "a6c38cbf-b86e-473b-8f78-b917dc31d239" (UID: "a6c38cbf-b86e-473b-8f78-b917dc31d239"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.275886 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.275919 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.275930 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x2bp\" (UniqueName: \"kubernetes.io/projected/a6c38cbf-b86e-473b-8f78-b917dc31d239-kube-api-access-4x2bp\") on node \"crc\" DevicePath \"\"" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.275942 4956 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a6c38cbf-b86e-473b-8f78-b917dc31d239-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.275952 4956 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c38cbf-b86e-473b-8f78-b917dc31d239-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.580657 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" event={"ID":"a6c38cbf-b86e-473b-8f78-b917dc31d239","Type":"ContainerDied","Data":"04fdb7c0d93fc1a6bfdfbd4406047ae5398fe87e2199e25ed623ab2f7dd12ddd"} Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.580699 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04fdb7c0d93fc1a6bfdfbd4406047ae5398fe87e2199e25ed623ab2f7dd12ddd" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.580697 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fb24p" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.673839 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c"] Sep 30 06:02:32 crc kubenswrapper[4956]: E0930 06:02:32.674826 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c38cbf-b86e-473b-8f78-b917dc31d239" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.674847 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c38cbf-b86e-473b-8f78-b917dc31d239" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.675081 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6c38cbf-b86e-473b-8f78-b917dc31d239" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.691109 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.699576 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.699799 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.699936 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.700196 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.700527 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.702962 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.707930 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c"] Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.789924 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.790001 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.790022 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqrdc\" (UniqueName: \"kubernetes.io/projected/f99ede59-582c-4ed5-99e4-6ba65d66aedb-kube-api-access-vqrdc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.790052 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.790071 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.790176 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.892065 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.892199 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.892245 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.892266 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqrdc\" (UniqueName: 
\"kubernetes.io/projected/f99ede59-582c-4ed5-99e4-6ba65d66aedb-kube-api-access-vqrdc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.892296 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.892317 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.897751 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.898172 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.898762 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.902769 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.906205 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:32 crc kubenswrapper[4956]: I0930 06:02:32.909231 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqrdc\" (UniqueName: \"kubernetes.io/projected/f99ede59-582c-4ed5-99e4-6ba65d66aedb-kube-api-access-vqrdc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 
06:02:33 crc kubenswrapper[4956]: I0930 06:02:33.034136 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:02:33 crc kubenswrapper[4956]: I0930 06:02:33.563464 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c"] Sep 30 06:02:33 crc kubenswrapper[4956]: I0930 06:02:33.591205 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" event={"ID":"f99ede59-582c-4ed5-99e4-6ba65d66aedb","Type":"ContainerStarted","Data":"fd101e0b848111a1ef5a7bebb9a857386e11c3f174ae6e5a44cf7efaa1aa4c24"} Sep 30 06:02:34 crc kubenswrapper[4956]: I0930 06:02:34.601026 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" event={"ID":"f99ede59-582c-4ed5-99e4-6ba65d66aedb","Type":"ContainerStarted","Data":"9ddec3c8d4a13fd8f247a896e66fdc25f974637cbb6377fc142a4db32e24e107"} Sep 30 06:02:34 crc kubenswrapper[4956]: I0930 06:02:34.622184 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" podStartSLOduration=2.112284114 podStartE2EDuration="2.622098396s" podCreationTimestamp="2025-09-30 06:02:32 +0000 UTC" firstStartedPulling="2025-09-30 06:02:33.563475121 +0000 UTC m=+2023.890595646" lastFinishedPulling="2025-09-30 06:02:34.073289393 +0000 UTC m=+2024.400409928" observedRunningTime="2025-09-30 06:02:34.615466029 +0000 UTC m=+2024.942586574" watchObservedRunningTime="2025-09-30 06:02:34.622098396 +0000 UTC m=+2024.949218921" Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.365224 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4rmr9"] Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.372858 4956 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.381400 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rmr9"] Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.500434 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4549ee1c-7530-439b-a49e-19db4165108f-utilities\") pod \"community-operators-4rmr9\" (UID: \"4549ee1c-7530-439b-a49e-19db4165108f\") " pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.500483 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4549ee1c-7530-439b-a49e-19db4165108f-catalog-content\") pod \"community-operators-4rmr9\" (UID: \"4549ee1c-7530-439b-a49e-19db4165108f\") " pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.500626 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsqxb\" (UniqueName: \"kubernetes.io/projected/4549ee1c-7530-439b-a49e-19db4165108f-kube-api-access-zsqxb\") pod \"community-operators-4rmr9\" (UID: \"4549ee1c-7530-439b-a49e-19db4165108f\") " pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.602645 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4549ee1c-7530-439b-a49e-19db4165108f-utilities\") pod \"community-operators-4rmr9\" (UID: \"4549ee1c-7530-439b-a49e-19db4165108f\") " pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.602698 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4549ee1c-7530-439b-a49e-19db4165108f-catalog-content\") pod \"community-operators-4rmr9\" (UID: \"4549ee1c-7530-439b-a49e-19db4165108f\") " pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.602811 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsqxb\" (UniqueName: \"kubernetes.io/projected/4549ee1c-7530-439b-a49e-19db4165108f-kube-api-access-zsqxb\") pod \"community-operators-4rmr9\" (UID: \"4549ee1c-7530-439b-a49e-19db4165108f\") " pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.603146 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4549ee1c-7530-439b-a49e-19db4165108f-utilities\") pod \"community-operators-4rmr9\" (UID: \"4549ee1c-7530-439b-a49e-19db4165108f\") " pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.603222 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4549ee1c-7530-439b-a49e-19db4165108f-catalog-content\") pod \"community-operators-4rmr9\" (UID: \"4549ee1c-7530-439b-a49e-19db4165108f\") " pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.630350 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsqxb\" (UniqueName: \"kubernetes.io/projected/4549ee1c-7530-439b-a49e-19db4165108f-kube-api-access-zsqxb\") pod \"community-operators-4rmr9\" (UID: \"4549ee1c-7530-439b-a49e-19db4165108f\") " pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:38 crc kubenswrapper[4956]: I0930 06:02:38.699563 4956 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:39 crc kubenswrapper[4956]: I0930 06:02:39.254880 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rmr9"] Sep 30 06:02:39 crc kubenswrapper[4956]: I0930 06:02:39.652380 4956 generic.go:334] "Generic (PLEG): container finished" podID="4549ee1c-7530-439b-a49e-19db4165108f" containerID="ae17e06a5e9b701ba5dd8b6aa9f22991e0f85600365c7f86c51c55fe874474b6" exitCode=0 Sep 30 06:02:39 crc kubenswrapper[4956]: I0930 06:02:39.652434 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rmr9" event={"ID":"4549ee1c-7530-439b-a49e-19db4165108f","Type":"ContainerDied","Data":"ae17e06a5e9b701ba5dd8b6aa9f22991e0f85600365c7f86c51c55fe874474b6"} Sep 30 06:02:39 crc kubenswrapper[4956]: I0930 06:02:39.652461 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rmr9" event={"ID":"4549ee1c-7530-439b-a49e-19db4165108f","Type":"ContainerStarted","Data":"bb9975560928709b1ea58b2af4e29bdda4211381a6c353c229b5a33a8be2f62a"} Sep 30 06:02:40 crc kubenswrapper[4956]: I0930 06:02:40.663067 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rmr9" event={"ID":"4549ee1c-7530-439b-a49e-19db4165108f","Type":"ContainerStarted","Data":"04a6d3e7a1f4da7b23c75d37c48c7ca0e838380528722988df5bd5236891f83d"} Sep 30 06:02:42 crc kubenswrapper[4956]: I0930 06:02:42.685149 4956 generic.go:334] "Generic (PLEG): container finished" podID="4549ee1c-7530-439b-a49e-19db4165108f" containerID="04a6d3e7a1f4da7b23c75d37c48c7ca0e838380528722988df5bd5236891f83d" exitCode=0 Sep 30 06:02:42 crc kubenswrapper[4956]: I0930 06:02:42.685217 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rmr9" 
event={"ID":"4549ee1c-7530-439b-a49e-19db4165108f","Type":"ContainerDied","Data":"04a6d3e7a1f4da7b23c75d37c48c7ca0e838380528722988df5bd5236891f83d"} Sep 30 06:02:43 crc kubenswrapper[4956]: I0930 06:02:43.719869 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rmr9" event={"ID":"4549ee1c-7530-439b-a49e-19db4165108f","Type":"ContainerStarted","Data":"8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf"} Sep 30 06:02:43 crc kubenswrapper[4956]: I0930 06:02:43.750346 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4rmr9" podStartSLOduration=2.3204725379999998 podStartE2EDuration="5.750323276s" podCreationTimestamp="2025-09-30 06:02:38 +0000 UTC" firstStartedPulling="2025-09-30 06:02:39.655014697 +0000 UTC m=+2029.982135252" lastFinishedPulling="2025-09-30 06:02:43.084865465 +0000 UTC m=+2033.411985990" observedRunningTime="2025-09-30 06:02:43.741159459 +0000 UTC m=+2034.068280014" watchObservedRunningTime="2025-09-30 06:02:43.750323276 +0000 UTC m=+2034.077443801" Sep 30 06:02:48 crc kubenswrapper[4956]: I0930 06:02:48.700354 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:48 crc kubenswrapper[4956]: I0930 06:02:48.700999 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:48 crc kubenswrapper[4956]: I0930 06:02:48.775433 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:48 crc kubenswrapper[4956]: I0930 06:02:48.852199 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:49 crc kubenswrapper[4956]: I0930 06:02:49.019309 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-4rmr9"] Sep 30 06:02:50 crc kubenswrapper[4956]: I0930 06:02:50.805538 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4rmr9" podUID="4549ee1c-7530-439b-a49e-19db4165108f" containerName="registry-server" containerID="cri-o://8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf" gracePeriod=2 Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.235330 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.350542 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4549ee1c-7530-439b-a49e-19db4165108f-utilities\") pod \"4549ee1c-7530-439b-a49e-19db4165108f\" (UID: \"4549ee1c-7530-439b-a49e-19db4165108f\") " Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.350611 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsqxb\" (UniqueName: \"kubernetes.io/projected/4549ee1c-7530-439b-a49e-19db4165108f-kube-api-access-zsqxb\") pod \"4549ee1c-7530-439b-a49e-19db4165108f\" (UID: \"4549ee1c-7530-439b-a49e-19db4165108f\") " Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.350898 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4549ee1c-7530-439b-a49e-19db4165108f-catalog-content\") pod \"4549ee1c-7530-439b-a49e-19db4165108f\" (UID: \"4549ee1c-7530-439b-a49e-19db4165108f\") " Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.351334 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4549ee1c-7530-439b-a49e-19db4165108f-utilities" (OuterVolumeSpecName: "utilities") pod "4549ee1c-7530-439b-a49e-19db4165108f" (UID: 
"4549ee1c-7530-439b-a49e-19db4165108f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.357054 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4549ee1c-7530-439b-a49e-19db4165108f-kube-api-access-zsqxb" (OuterVolumeSpecName: "kube-api-access-zsqxb") pod "4549ee1c-7530-439b-a49e-19db4165108f" (UID: "4549ee1c-7530-439b-a49e-19db4165108f"). InnerVolumeSpecName "kube-api-access-zsqxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.398753 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4549ee1c-7530-439b-a49e-19db4165108f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4549ee1c-7530-439b-a49e-19db4165108f" (UID: "4549ee1c-7530-439b-a49e-19db4165108f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.452893 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4549ee1c-7530-439b-a49e-19db4165108f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.452922 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4549ee1c-7530-439b-a49e-19db4165108f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.452934 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsqxb\" (UniqueName: \"kubernetes.io/projected/4549ee1c-7530-439b-a49e-19db4165108f-kube-api-access-zsqxb\") on node \"crc\" DevicePath \"\"" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.814765 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="4549ee1c-7530-439b-a49e-19db4165108f" containerID="8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf" exitCode=0 Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.814804 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rmr9" event={"ID":"4549ee1c-7530-439b-a49e-19db4165108f","Type":"ContainerDied","Data":"8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf"} Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.814831 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rmr9" event={"ID":"4549ee1c-7530-439b-a49e-19db4165108f","Type":"ContainerDied","Data":"bb9975560928709b1ea58b2af4e29bdda4211381a6c353c229b5a33a8be2f62a"} Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.814847 4956 scope.go:117] "RemoveContainer" containerID="8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.814955 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4rmr9" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.845967 4956 scope.go:117] "RemoveContainer" containerID="04a6d3e7a1f4da7b23c75d37c48c7ca0e838380528722988df5bd5236891f83d" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.848937 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4rmr9"] Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.859571 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4rmr9"] Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.876962 4956 scope.go:117] "RemoveContainer" containerID="ae17e06a5e9b701ba5dd8b6aa9f22991e0f85600365c7f86c51c55fe874474b6" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.921280 4956 scope.go:117] "RemoveContainer" containerID="8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf" Sep 30 06:02:51 crc kubenswrapper[4956]: E0930 06:02:51.921807 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf\": container with ID starting with 8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf not found: ID does not exist" containerID="8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.921869 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf"} err="failed to get container status \"8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf\": rpc error: code = NotFound desc = could not find container \"8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf\": container with ID starting with 8c7dad9c1a71af929c9f570cda9e1be853d91397d38912d31952643bea4d55bf not 
found: ID does not exist" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.921899 4956 scope.go:117] "RemoveContainer" containerID="04a6d3e7a1f4da7b23c75d37c48c7ca0e838380528722988df5bd5236891f83d" Sep 30 06:02:51 crc kubenswrapper[4956]: E0930 06:02:51.922585 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04a6d3e7a1f4da7b23c75d37c48c7ca0e838380528722988df5bd5236891f83d\": container with ID starting with 04a6d3e7a1f4da7b23c75d37c48c7ca0e838380528722988df5bd5236891f83d not found: ID does not exist" containerID="04a6d3e7a1f4da7b23c75d37c48c7ca0e838380528722988df5bd5236891f83d" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.922613 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04a6d3e7a1f4da7b23c75d37c48c7ca0e838380528722988df5bd5236891f83d"} err="failed to get container status \"04a6d3e7a1f4da7b23c75d37c48c7ca0e838380528722988df5bd5236891f83d\": rpc error: code = NotFound desc = could not find container \"04a6d3e7a1f4da7b23c75d37c48c7ca0e838380528722988df5bd5236891f83d\": container with ID starting with 04a6d3e7a1f4da7b23c75d37c48c7ca0e838380528722988df5bd5236891f83d not found: ID does not exist" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.922628 4956 scope.go:117] "RemoveContainer" containerID="ae17e06a5e9b701ba5dd8b6aa9f22991e0f85600365c7f86c51c55fe874474b6" Sep 30 06:02:51 crc kubenswrapper[4956]: E0930 06:02:51.922946 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae17e06a5e9b701ba5dd8b6aa9f22991e0f85600365c7f86c51c55fe874474b6\": container with ID starting with ae17e06a5e9b701ba5dd8b6aa9f22991e0f85600365c7f86c51c55fe874474b6 not found: ID does not exist" containerID="ae17e06a5e9b701ba5dd8b6aa9f22991e0f85600365c7f86c51c55fe874474b6" Sep 30 06:02:51 crc kubenswrapper[4956]: I0930 06:02:51.922988 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae17e06a5e9b701ba5dd8b6aa9f22991e0f85600365c7f86c51c55fe874474b6"} err="failed to get container status \"ae17e06a5e9b701ba5dd8b6aa9f22991e0f85600365c7f86c51c55fe874474b6\": rpc error: code = NotFound desc = could not find container \"ae17e06a5e9b701ba5dd8b6aa9f22991e0f85600365c7f86c51c55fe874474b6\": container with ID starting with ae17e06a5e9b701ba5dd8b6aa9f22991e0f85600365c7f86c51c55fe874474b6 not found: ID does not exist" Sep 30 06:02:52 crc kubenswrapper[4956]: I0930 06:02:52.360442 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4549ee1c-7530-439b-a49e-19db4165108f" path="/var/lib/kubelet/pods/4549ee1c-7530-439b-a49e-19db4165108f/volumes" Sep 30 06:03:32 crc kubenswrapper[4956]: I0930 06:03:32.244411 4956 generic.go:334] "Generic (PLEG): container finished" podID="f99ede59-582c-4ed5-99e4-6ba65d66aedb" containerID="9ddec3c8d4a13fd8f247a896e66fdc25f974637cbb6377fc142a4db32e24e107" exitCode=0 Sep 30 06:03:32 crc kubenswrapper[4956]: I0930 06:03:32.244484 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" event={"ID":"f99ede59-582c-4ed5-99e4-6ba65d66aedb","Type":"ContainerDied","Data":"9ddec3c8d4a13fd8f247a896e66fdc25f974637cbb6377fc142a4db32e24e107"} Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.657097 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.799079 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-inventory\") pod \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.799274 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.799299 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-ssh-key\") pod \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.799325 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-nova-metadata-neutron-config-0\") pod \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.799427 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqrdc\" (UniqueName: \"kubernetes.io/projected/f99ede59-582c-4ed5-99e4-6ba65d66aedb-kube-api-access-vqrdc\") pod \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 
06:03:33.799471 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-neutron-metadata-combined-ca-bundle\") pod \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\" (UID: \"f99ede59-582c-4ed5-99e4-6ba65d66aedb\") " Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.809720 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f99ede59-582c-4ed5-99e4-6ba65d66aedb" (UID: "f99ede59-582c-4ed5-99e4-6ba65d66aedb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.809858 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99ede59-582c-4ed5-99e4-6ba65d66aedb-kube-api-access-vqrdc" (OuterVolumeSpecName: "kube-api-access-vqrdc") pod "f99ede59-582c-4ed5-99e4-6ba65d66aedb" (UID: "f99ede59-582c-4ed5-99e4-6ba65d66aedb"). InnerVolumeSpecName "kube-api-access-vqrdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.826661 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f99ede59-582c-4ed5-99e4-6ba65d66aedb" (UID: "f99ede59-582c-4ed5-99e4-6ba65d66aedb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.831931 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f99ede59-582c-4ed5-99e4-6ba65d66aedb" (UID: "f99ede59-582c-4ed5-99e4-6ba65d66aedb"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.833365 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f99ede59-582c-4ed5-99e4-6ba65d66aedb" (UID: "f99ede59-582c-4ed5-99e4-6ba65d66aedb"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.833389 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-inventory" (OuterVolumeSpecName: "inventory") pod "f99ede59-582c-4ed5-99e4-6ba65d66aedb" (UID: "f99ede59-582c-4ed5-99e4-6ba65d66aedb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.902350 4956 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.902388 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.902401 4956 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.902416 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqrdc\" (UniqueName: \"kubernetes.io/projected/f99ede59-582c-4ed5-99e4-6ba65d66aedb-kube-api-access-vqrdc\") on node \"crc\" DevicePath \"\"" Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.902429 4956 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:03:33 crc kubenswrapper[4956]: I0930 06:03:33.902444 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f99ede59-582c-4ed5-99e4-6ba65d66aedb-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.264029 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" 
event={"ID":"f99ede59-582c-4ed5-99e4-6ba65d66aedb","Type":"ContainerDied","Data":"fd101e0b848111a1ef5a7bebb9a857386e11c3f174ae6e5a44cf7efaa1aa4c24"} Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.264347 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd101e0b848111a1ef5a7bebb9a857386e11c3f174ae6e5a44cf7efaa1aa4c24" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.264404 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.425389 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk"] Sep 30 06:03:34 crc kubenswrapper[4956]: E0930 06:03:34.425923 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4549ee1c-7530-439b-a49e-19db4165108f" containerName="registry-server" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.425944 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4549ee1c-7530-439b-a49e-19db4165108f" containerName="registry-server" Sep 30 06:03:34 crc kubenswrapper[4956]: E0930 06:03:34.425987 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4549ee1c-7530-439b-a49e-19db4165108f" containerName="extract-utilities" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.425997 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4549ee1c-7530-439b-a49e-19db4165108f" containerName="extract-utilities" Sep 30 06:03:34 crc kubenswrapper[4956]: E0930 06:03:34.426013 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99ede59-582c-4ed5-99e4-6ba65d66aedb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.426023 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99ede59-582c-4ed5-99e4-6ba65d66aedb" 
containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 06:03:34 crc kubenswrapper[4956]: E0930 06:03:34.426056 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4549ee1c-7530-439b-a49e-19db4165108f" containerName="extract-content" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.426063 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4549ee1c-7530-439b-a49e-19db4165108f" containerName="extract-content" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.426304 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4549ee1c-7530-439b-a49e-19db4165108f" containerName="registry-server" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.426326 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f99ede59-582c-4ed5-99e4-6ba65d66aedb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.427174 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.429250 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.429990 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.429997 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.430043 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.430414 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.433492 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk"] Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.614519 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.614654 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.614703 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.614835 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.614878 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpt2k\" (UniqueName: \"kubernetes.io/projected/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-kube-api-access-vpt2k\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.717025 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.717209 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.717245 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt2k\" (UniqueName: \"kubernetes.io/projected/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-kube-api-access-vpt2k\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.717277 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.717331 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.723760 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 
06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.724439 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.726158 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.733639 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.737829 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpt2k\" (UniqueName: \"kubernetes.io/projected/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-kube-api-access-vpt2k\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-scntk\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:34 crc kubenswrapper[4956]: I0930 06:03:34.746546 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" Sep 30 06:03:35 crc kubenswrapper[4956]: I0930 06:03:35.262883 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk"] Sep 30 06:03:35 crc kubenswrapper[4956]: I0930 06:03:35.277163 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" event={"ID":"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4","Type":"ContainerStarted","Data":"e467722f0d6a98213783b7b14bd6580da1a45bff48f315e6b1b569d7f1f8e8e7"} Sep 30 06:03:36 crc kubenswrapper[4956]: I0930 06:03:36.288706 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" event={"ID":"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4","Type":"ContainerStarted","Data":"ec7f82e9d856e6dfb313ed68a13dad6857d80b75193134327b9841b80a98296c"} Sep 30 06:03:36 crc kubenswrapper[4956]: I0930 06:03:36.313290 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" podStartSLOduration=1.732824863 podStartE2EDuration="2.313268048s" podCreationTimestamp="2025-09-30 06:03:34 +0000 UTC" firstStartedPulling="2025-09-30 06:03:35.266844895 +0000 UTC m=+2085.593965420" lastFinishedPulling="2025-09-30 06:03:35.84728808 +0000 UTC m=+2086.174408605" observedRunningTime="2025-09-30 06:03:36.302663995 +0000 UTC m=+2086.629784520" watchObservedRunningTime="2025-09-30 06:03:36.313268048 +0000 UTC m=+2086.640388573" Sep 30 06:03:48 crc kubenswrapper[4956]: I0930 06:03:48.073670 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:03:48 crc kubenswrapper[4956]: I0930 
06:03:48.074243 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:04:18 crc kubenswrapper[4956]: I0930 06:04:18.073338 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:04:18 crc kubenswrapper[4956]: I0930 06:04:18.073876 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.340652 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gxdjr"] Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.344410 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.355839 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gxdjr"] Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.494552 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d68bd411-7b28-4405-b7d2-5f7b40237b51-catalog-content\") pod \"certified-operators-gxdjr\" (UID: \"d68bd411-7b28-4405-b7d2-5f7b40237b51\") " pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.494888 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz7g5\" (UniqueName: \"kubernetes.io/projected/d68bd411-7b28-4405-b7d2-5f7b40237b51-kube-api-access-dz7g5\") pod \"certified-operators-gxdjr\" (UID: \"d68bd411-7b28-4405-b7d2-5f7b40237b51\") " pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.495060 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d68bd411-7b28-4405-b7d2-5f7b40237b51-utilities\") pod \"certified-operators-gxdjr\" (UID: \"d68bd411-7b28-4405-b7d2-5f7b40237b51\") " pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.596984 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d68bd411-7b28-4405-b7d2-5f7b40237b51-catalog-content\") pod \"certified-operators-gxdjr\" (UID: \"d68bd411-7b28-4405-b7d2-5f7b40237b51\") " pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.597042 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dz7g5\" (UniqueName: \"kubernetes.io/projected/d68bd411-7b28-4405-b7d2-5f7b40237b51-kube-api-access-dz7g5\") pod \"certified-operators-gxdjr\" (UID: \"d68bd411-7b28-4405-b7d2-5f7b40237b51\") " pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.597068 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d68bd411-7b28-4405-b7d2-5f7b40237b51-utilities\") pod \"certified-operators-gxdjr\" (UID: \"d68bd411-7b28-4405-b7d2-5f7b40237b51\") " pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.597838 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d68bd411-7b28-4405-b7d2-5f7b40237b51-utilities\") pod \"certified-operators-gxdjr\" (UID: \"d68bd411-7b28-4405-b7d2-5f7b40237b51\") " pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.598224 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d68bd411-7b28-4405-b7d2-5f7b40237b51-catalog-content\") pod \"certified-operators-gxdjr\" (UID: \"d68bd411-7b28-4405-b7d2-5f7b40237b51\") " pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.618956 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz7g5\" (UniqueName: \"kubernetes.io/projected/d68bd411-7b28-4405-b7d2-5f7b40237b51-kube-api-access-dz7g5\") pod \"certified-operators-gxdjr\" (UID: \"d68bd411-7b28-4405-b7d2-5f7b40237b51\") " pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:33 crc kubenswrapper[4956]: I0930 06:04:33.669923 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:34 crc kubenswrapper[4956]: I0930 06:04:34.124338 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gxdjr"] Sep 30 06:04:34 crc kubenswrapper[4956]: I0930 06:04:34.871777 4956 generic.go:334] "Generic (PLEG): container finished" podID="d68bd411-7b28-4405-b7d2-5f7b40237b51" containerID="46a5b699e8f772ae6ae7905c908e1e9b2409f6b313fb1ffc760d748b599abb92" exitCode=0 Sep 30 06:04:34 crc kubenswrapper[4956]: I0930 06:04:34.871830 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxdjr" event={"ID":"d68bd411-7b28-4405-b7d2-5f7b40237b51","Type":"ContainerDied","Data":"46a5b699e8f772ae6ae7905c908e1e9b2409f6b313fb1ffc760d748b599abb92"} Sep 30 06:04:34 crc kubenswrapper[4956]: I0930 06:04:34.871863 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxdjr" event={"ID":"d68bd411-7b28-4405-b7d2-5f7b40237b51","Type":"ContainerStarted","Data":"55342c80e9262429961df3de757939e4f428ada22b2df6245a3b3d91627cd1c3"} Sep 30 06:04:35 crc kubenswrapper[4956]: I0930 06:04:35.882912 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxdjr" event={"ID":"d68bd411-7b28-4405-b7d2-5f7b40237b51","Type":"ContainerStarted","Data":"6532f49686669573cd7f457a9e7c891882c2f146fbbb67e014b824f73a837510"} Sep 30 06:04:36 crc kubenswrapper[4956]: I0930 06:04:36.892620 4956 generic.go:334] "Generic (PLEG): container finished" podID="d68bd411-7b28-4405-b7d2-5f7b40237b51" containerID="6532f49686669573cd7f457a9e7c891882c2f146fbbb67e014b824f73a837510" exitCode=0 Sep 30 06:04:36 crc kubenswrapper[4956]: I0930 06:04:36.892668 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxdjr" 
event={"ID":"d68bd411-7b28-4405-b7d2-5f7b40237b51","Type":"ContainerDied","Data":"6532f49686669573cd7f457a9e7c891882c2f146fbbb67e014b824f73a837510"} Sep 30 06:04:37 crc kubenswrapper[4956]: I0930 06:04:37.903084 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxdjr" event={"ID":"d68bd411-7b28-4405-b7d2-5f7b40237b51","Type":"ContainerStarted","Data":"1809ff8c1a5be636962b5285f3fe124995c7753a15e0b287d36ae71f802936a9"} Sep 30 06:04:37 crc kubenswrapper[4956]: I0930 06:04:37.932512 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gxdjr" podStartSLOduration=2.181379335 podStartE2EDuration="4.932490157s" podCreationTimestamp="2025-09-30 06:04:33 +0000 UTC" firstStartedPulling="2025-09-30 06:04:34.873401273 +0000 UTC m=+2145.200521798" lastFinishedPulling="2025-09-30 06:04:37.624512095 +0000 UTC m=+2147.951632620" observedRunningTime="2025-09-30 06:04:37.922506544 +0000 UTC m=+2148.249627079" watchObservedRunningTime="2025-09-30 06:04:37.932490157 +0000 UTC m=+2148.259610692" Sep 30 06:04:43 crc kubenswrapper[4956]: I0930 06:04:43.671724 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:43 crc kubenswrapper[4956]: I0930 06:04:43.672439 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:43 crc kubenswrapper[4956]: I0930 06:04:43.747935 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:44 crc kubenswrapper[4956]: I0930 06:04:44.046701 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:44 crc kubenswrapper[4956]: I0930 06:04:44.130867 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-gxdjr"] Sep 30 06:04:45 crc kubenswrapper[4956]: I0930 06:04:45.994103 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gxdjr" podUID="d68bd411-7b28-4405-b7d2-5f7b40237b51" containerName="registry-server" containerID="cri-o://1809ff8c1a5be636962b5285f3fe124995c7753a15e0b287d36ae71f802936a9" gracePeriod=2 Sep 30 06:04:46 crc kubenswrapper[4956]: E0930 06:04:46.078842 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd68bd411_7b28_4405_b7d2_5f7b40237b51.slice/crio-1809ff8c1a5be636962b5285f3fe124995c7753a15e0b287d36ae71f802936a9.scope\": RecentStats: unable to find data in memory cache]" Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.016306 4956 generic.go:334] "Generic (PLEG): container finished" podID="d68bd411-7b28-4405-b7d2-5f7b40237b51" containerID="1809ff8c1a5be636962b5285f3fe124995c7753a15e0b287d36ae71f802936a9" exitCode=0 Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.016403 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxdjr" event={"ID":"d68bd411-7b28-4405-b7d2-5f7b40237b51","Type":"ContainerDied","Data":"1809ff8c1a5be636962b5285f3fe124995c7753a15e0b287d36ae71f802936a9"} Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.016440 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gxdjr" event={"ID":"d68bd411-7b28-4405-b7d2-5f7b40237b51","Type":"ContainerDied","Data":"55342c80e9262429961df3de757939e4f428ada22b2df6245a3b3d91627cd1c3"} Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.016470 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55342c80e9262429961df3de757939e4f428ada22b2df6245a3b3d91627cd1c3" Sep 30 06:04:47 crc kubenswrapper[4956]: 
I0930 06:04:47.066143 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.135412 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz7g5\" (UniqueName: \"kubernetes.io/projected/d68bd411-7b28-4405-b7d2-5f7b40237b51-kube-api-access-dz7g5\") pod \"d68bd411-7b28-4405-b7d2-5f7b40237b51\" (UID: \"d68bd411-7b28-4405-b7d2-5f7b40237b51\") " Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.135538 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d68bd411-7b28-4405-b7d2-5f7b40237b51-utilities\") pod \"d68bd411-7b28-4405-b7d2-5f7b40237b51\" (UID: \"d68bd411-7b28-4405-b7d2-5f7b40237b51\") " Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.135837 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d68bd411-7b28-4405-b7d2-5f7b40237b51-catalog-content\") pod \"d68bd411-7b28-4405-b7d2-5f7b40237b51\" (UID: \"d68bd411-7b28-4405-b7d2-5f7b40237b51\") " Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.136283 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d68bd411-7b28-4405-b7d2-5f7b40237b51-utilities" (OuterVolumeSpecName: "utilities") pod "d68bd411-7b28-4405-b7d2-5f7b40237b51" (UID: "d68bd411-7b28-4405-b7d2-5f7b40237b51"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.136500 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d68bd411-7b28-4405-b7d2-5f7b40237b51-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.146888 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d68bd411-7b28-4405-b7d2-5f7b40237b51-kube-api-access-dz7g5" (OuterVolumeSpecName: "kube-api-access-dz7g5") pod "d68bd411-7b28-4405-b7d2-5f7b40237b51" (UID: "d68bd411-7b28-4405-b7d2-5f7b40237b51"). InnerVolumeSpecName "kube-api-access-dz7g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.186787 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d68bd411-7b28-4405-b7d2-5f7b40237b51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d68bd411-7b28-4405-b7d2-5f7b40237b51" (UID: "d68bd411-7b28-4405-b7d2-5f7b40237b51"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.237714 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz7g5\" (UniqueName: \"kubernetes.io/projected/d68bd411-7b28-4405-b7d2-5f7b40237b51-kube-api-access-dz7g5\") on node \"crc\" DevicePath \"\"" Sep 30 06:04:47 crc kubenswrapper[4956]: I0930 06:04:47.237748 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d68bd411-7b28-4405-b7d2-5f7b40237b51-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:04:48 crc kubenswrapper[4956]: I0930 06:04:48.025942 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gxdjr" Sep 30 06:04:48 crc kubenswrapper[4956]: I0930 06:04:48.066512 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gxdjr"] Sep 30 06:04:48 crc kubenswrapper[4956]: I0930 06:04:48.073315 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:04:48 crc kubenswrapper[4956]: I0930 06:04:48.073359 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:04:48 crc kubenswrapper[4956]: I0930 06:04:48.073398 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 06:04:48 crc kubenswrapper[4956]: I0930 06:04:48.074135 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:04:48 crc kubenswrapper[4956]: I0930 06:04:48.074187 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" 
containerID="cri-o://9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" gracePeriod=600 Sep 30 06:04:48 crc kubenswrapper[4956]: I0930 06:04:48.076550 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gxdjr"] Sep 30 06:04:48 crc kubenswrapper[4956]: E0930 06:04:48.209157 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:04:48 crc kubenswrapper[4956]: I0930 06:04:48.370018 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d68bd411-7b28-4405-b7d2-5f7b40237b51" path="/var/lib/kubelet/pods/d68bd411-7b28-4405-b7d2-5f7b40237b51/volumes" Sep 30 06:04:49 crc kubenswrapper[4956]: I0930 06:04:49.039453 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" exitCode=0 Sep 30 06:04:49 crc kubenswrapper[4956]: I0930 06:04:49.039520 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894"} Sep 30 06:04:49 crc kubenswrapper[4956]: I0930 06:04:49.039569 4956 scope.go:117] "RemoveContainer" containerID="0dfc54b32590324bdbf7360f23fe87e3bb3d8dad1586a2b8c737285b6be9a13a" Sep 30 06:04:49 crc kubenswrapper[4956]: I0930 06:04:49.040733 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:04:49 crc 
kubenswrapper[4956]: E0930 06:04:49.041418 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:05:02 crc kubenswrapper[4956]: I0930 06:05:02.341692 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:05:02 crc kubenswrapper[4956]: E0930 06:05:02.343509 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:05:16 crc kubenswrapper[4956]: I0930 06:05:16.343535 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:05:16 crc kubenswrapper[4956]: E0930 06:05:16.344601 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:05:29 crc kubenswrapper[4956]: I0930 06:05:29.341768 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 
30 06:05:29 crc kubenswrapper[4956]: E0930 06:05:29.342818 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:05:41 crc kubenswrapper[4956]: I0930 06:05:41.341192 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:05:41 crc kubenswrapper[4956]: E0930 06:05:41.342211 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:05:52 crc kubenswrapper[4956]: I0930 06:05:52.340992 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:05:52 crc kubenswrapper[4956]: E0930 06:05:52.341842 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:06:06 crc kubenswrapper[4956]: I0930 06:06:06.341413 4956 scope.go:117] "RemoveContainer" 
containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:06:06 crc kubenswrapper[4956]: E0930 06:06:06.342525 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:06:19 crc kubenswrapper[4956]: I0930 06:06:19.341250 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:06:19 crc kubenswrapper[4956]: E0930 06:06:19.342865 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:06:30 crc kubenswrapper[4956]: I0930 06:06:30.349088 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:06:30 crc kubenswrapper[4956]: E0930 06:06:30.349873 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:06:42 crc kubenswrapper[4956]: I0930 06:06:42.341462 4956 scope.go:117] 
"RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:06:42 crc kubenswrapper[4956]: E0930 06:06:42.342339 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:06:57 crc kubenswrapper[4956]: I0930 06:06:57.341565 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:06:57 crc kubenswrapper[4956]: E0930 06:06:57.342588 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:07:09 crc kubenswrapper[4956]: I0930 06:07:09.341948 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:07:09 crc kubenswrapper[4956]: E0930 06:07:09.342942 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:07:22 crc kubenswrapper[4956]: I0930 06:07:22.341909 
4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:07:22 crc kubenswrapper[4956]: E0930 06:07:22.343045 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:07:36 crc kubenswrapper[4956]: I0930 06:07:36.342506 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:07:36 crc kubenswrapper[4956]: E0930 06:07:36.344190 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:07:51 crc kubenswrapper[4956]: I0930 06:07:51.341502 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:07:51 crc kubenswrapper[4956]: E0930 06:07:51.342520 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:08:06 crc kubenswrapper[4956]: I0930 
06:08:06.341370 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:08:06 crc kubenswrapper[4956]: E0930 06:08:06.342789 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.303320 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8r5nk"] Sep 30 06:08:09 crc kubenswrapper[4956]: E0930 06:08:09.303905 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d68bd411-7b28-4405-b7d2-5f7b40237b51" containerName="registry-server" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.303917 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d68bd411-7b28-4405-b7d2-5f7b40237b51" containerName="registry-server" Sep 30 06:08:09 crc kubenswrapper[4956]: E0930 06:08:09.303933 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d68bd411-7b28-4405-b7d2-5f7b40237b51" containerName="extract-utilities" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.303939 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d68bd411-7b28-4405-b7d2-5f7b40237b51" containerName="extract-utilities" Sep 30 06:08:09 crc kubenswrapper[4956]: E0930 06:08:09.303952 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d68bd411-7b28-4405-b7d2-5f7b40237b51" containerName="extract-content" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.303958 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d68bd411-7b28-4405-b7d2-5f7b40237b51" containerName="extract-content" Sep 30 
06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.304173 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d68bd411-7b28-4405-b7d2-5f7b40237b51" containerName="registry-server" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.305558 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.333434 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8r5nk"] Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.402485 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-catalog-content\") pod \"redhat-marketplace-8r5nk\" (UID: \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\") " pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.402765 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpsht\" (UniqueName: \"kubernetes.io/projected/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-kube-api-access-mpsht\") pod \"redhat-marketplace-8r5nk\" (UID: \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\") " pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.403143 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-utilities\") pod \"redhat-marketplace-8r5nk\" (UID: \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\") " pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.504678 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-utilities\") pod \"redhat-marketplace-8r5nk\" (UID: \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\") " pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.504822 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-catalog-content\") pod \"redhat-marketplace-8r5nk\" (UID: \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\") " pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.504873 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpsht\" (UniqueName: \"kubernetes.io/projected/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-kube-api-access-mpsht\") pod \"redhat-marketplace-8r5nk\" (UID: \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\") " pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.505392 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-utilities\") pod \"redhat-marketplace-8r5nk\" (UID: \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\") " pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.505508 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-catalog-content\") pod \"redhat-marketplace-8r5nk\" (UID: \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\") " pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.538460 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpsht\" (UniqueName: 
\"kubernetes.io/projected/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-kube-api-access-mpsht\") pod \"redhat-marketplace-8r5nk\" (UID: \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\") " pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:09 crc kubenswrapper[4956]: I0930 06:08:09.646580 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:10 crc kubenswrapper[4956]: I0930 06:08:10.165594 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8r5nk"] Sep 30 06:08:10 crc kubenswrapper[4956]: I0930 06:08:10.356102 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8r5nk" event={"ID":"47ba6146-dfd3-44f9-b9c8-229b1ff1553b","Type":"ContainerStarted","Data":"4483bf4faf079490d84df521087f547e0a2538c9ca100ee1a56e555349965842"} Sep 30 06:08:11 crc kubenswrapper[4956]: I0930 06:08:11.352339 4956 generic.go:334] "Generic (PLEG): container finished" podID="47ba6146-dfd3-44f9-b9c8-229b1ff1553b" containerID="5f97238964abdd317710759fe00af951092a1b52deabbd9feb26b92735a1cfcf" exitCode=0 Sep 30 06:08:11 crc kubenswrapper[4956]: I0930 06:08:11.352417 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8r5nk" event={"ID":"47ba6146-dfd3-44f9-b9c8-229b1ff1553b","Type":"ContainerDied","Data":"5f97238964abdd317710759fe00af951092a1b52deabbd9feb26b92735a1cfcf"} Sep 30 06:08:11 crc kubenswrapper[4956]: I0930 06:08:11.355103 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:08:12 crc kubenswrapper[4956]: I0930 06:08:12.428038 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8r5nk" event={"ID":"47ba6146-dfd3-44f9-b9c8-229b1ff1553b","Type":"ContainerStarted","Data":"0e284a883f0be0d129e424418d4f595a0d8fb01b0f2a964464b32405fdf30884"} Sep 30 06:08:13 crc 
kubenswrapper[4956]: I0930 06:08:13.439804 4956 generic.go:334] "Generic (PLEG): container finished" podID="47ba6146-dfd3-44f9-b9c8-229b1ff1553b" containerID="0e284a883f0be0d129e424418d4f595a0d8fb01b0f2a964464b32405fdf30884" exitCode=0 Sep 30 06:08:13 crc kubenswrapper[4956]: I0930 06:08:13.439917 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8r5nk" event={"ID":"47ba6146-dfd3-44f9-b9c8-229b1ff1553b","Type":"ContainerDied","Data":"0e284a883f0be0d129e424418d4f595a0d8fb01b0f2a964464b32405fdf30884"} Sep 30 06:08:13 crc kubenswrapper[4956]: I0930 06:08:13.440290 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8r5nk" event={"ID":"47ba6146-dfd3-44f9-b9c8-229b1ff1553b","Type":"ContainerStarted","Data":"83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b"} Sep 30 06:08:19 crc kubenswrapper[4956]: I0930 06:08:19.341593 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:08:19 crc kubenswrapper[4956]: E0930 06:08:19.342355 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:08:19 crc kubenswrapper[4956]: I0930 06:08:19.647413 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:19 crc kubenswrapper[4956]: I0930 06:08:19.647470 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:19 crc kubenswrapper[4956]: I0930 06:08:19.738755 4956 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:19 crc kubenswrapper[4956]: I0930 06:08:19.800323 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8r5nk" podStartSLOduration=9.205932899 podStartE2EDuration="10.800304191s" podCreationTimestamp="2025-09-30 06:08:09 +0000 UTC" firstStartedPulling="2025-09-30 06:08:11.354826688 +0000 UTC m=+2361.681947223" lastFinishedPulling="2025-09-30 06:08:12.94919795 +0000 UTC m=+2363.276318515" observedRunningTime="2025-09-30 06:08:13.470660454 +0000 UTC m=+2363.797780989" watchObservedRunningTime="2025-09-30 06:08:19.800304191 +0000 UTC m=+2370.127424716" Sep 30 06:08:20 crc kubenswrapper[4956]: I0930 06:08:20.596521 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:20 crc kubenswrapper[4956]: I0930 06:08:20.664235 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8r5nk"] Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.403140 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qhk9w"] Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.415232 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qhk9w"] Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.416048 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.483445 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789577-1762-4e7e-ac55-bcc788d0ac8b-utilities\") pod \"redhat-operators-qhk9w\" (UID: \"43789577-1762-4e7e-ac55-bcc788d0ac8b\") " pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.483513 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gbjl\" (UniqueName: \"kubernetes.io/projected/43789577-1762-4e7e-ac55-bcc788d0ac8b-kube-api-access-7gbjl\") pod \"redhat-operators-qhk9w\" (UID: \"43789577-1762-4e7e-ac55-bcc788d0ac8b\") " pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.483621 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789577-1762-4e7e-ac55-bcc788d0ac8b-catalog-content\") pod \"redhat-operators-qhk9w\" (UID: \"43789577-1762-4e7e-ac55-bcc788d0ac8b\") " pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.546620 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8r5nk" podUID="47ba6146-dfd3-44f9-b9c8-229b1ff1553b" containerName="registry-server" containerID="cri-o://83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b" gracePeriod=2 Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.585578 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789577-1762-4e7e-ac55-bcc788d0ac8b-utilities\") pod \"redhat-operators-qhk9w\" (UID: 
\"43789577-1762-4e7e-ac55-bcc788d0ac8b\") " pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.585651 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gbjl\" (UniqueName: \"kubernetes.io/projected/43789577-1762-4e7e-ac55-bcc788d0ac8b-kube-api-access-7gbjl\") pod \"redhat-operators-qhk9w\" (UID: \"43789577-1762-4e7e-ac55-bcc788d0ac8b\") " pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.585978 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789577-1762-4e7e-ac55-bcc788d0ac8b-catalog-content\") pod \"redhat-operators-qhk9w\" (UID: \"43789577-1762-4e7e-ac55-bcc788d0ac8b\") " pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.586700 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789577-1762-4e7e-ac55-bcc788d0ac8b-catalog-content\") pod \"redhat-operators-qhk9w\" (UID: \"43789577-1762-4e7e-ac55-bcc788d0ac8b\") " pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.586714 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789577-1762-4e7e-ac55-bcc788d0ac8b-utilities\") pod \"redhat-operators-qhk9w\" (UID: \"43789577-1762-4e7e-ac55-bcc788d0ac8b\") " pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.614042 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gbjl\" (UniqueName: \"kubernetes.io/projected/43789577-1762-4e7e-ac55-bcc788d0ac8b-kube-api-access-7gbjl\") pod \"redhat-operators-qhk9w\" (UID: \"43789577-1762-4e7e-ac55-bcc788d0ac8b\") " 
pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:22 crc kubenswrapper[4956]: I0930 06:08:22.749748 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.029275 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.104715 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-catalog-content\") pod \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\" (UID: \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\") " Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.104768 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpsht\" (UniqueName: \"kubernetes.io/projected/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-kube-api-access-mpsht\") pod \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\" (UID: \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\") " Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.104846 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-utilities\") pod \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\" (UID: \"47ba6146-dfd3-44f9-b9c8-229b1ff1553b\") " Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.109629 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-utilities" (OuterVolumeSpecName: "utilities") pod "47ba6146-dfd3-44f9-b9c8-229b1ff1553b" (UID: "47ba6146-dfd3-44f9-b9c8-229b1ff1553b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.116358 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-kube-api-access-mpsht" (OuterVolumeSpecName: "kube-api-access-mpsht") pod "47ba6146-dfd3-44f9-b9c8-229b1ff1553b" (UID: "47ba6146-dfd3-44f9-b9c8-229b1ff1553b"). InnerVolumeSpecName "kube-api-access-mpsht". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.122829 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47ba6146-dfd3-44f9-b9c8-229b1ff1553b" (UID: "47ba6146-dfd3-44f9-b9c8-229b1ff1553b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.207215 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.207253 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpsht\" (UniqueName: \"kubernetes.io/projected/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-kube-api-access-mpsht\") on node \"crc\" DevicePath \"\"" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.207268 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ba6146-dfd3-44f9-b9c8-229b1ff1553b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.382258 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qhk9w"] Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 
06:08:23.556766 4956 generic.go:334] "Generic (PLEG): container finished" podID="47ba6146-dfd3-44f9-b9c8-229b1ff1553b" containerID="83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b" exitCode=0 Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.556814 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8r5nk" event={"ID":"47ba6146-dfd3-44f9-b9c8-229b1ff1553b","Type":"ContainerDied","Data":"83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b"} Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.557299 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8r5nk" event={"ID":"47ba6146-dfd3-44f9-b9c8-229b1ff1553b","Type":"ContainerDied","Data":"4483bf4faf079490d84df521087f547e0a2538c9ca100ee1a56e555349965842"} Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.557323 4956 scope.go:117] "RemoveContainer" containerID="83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.556836 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8r5nk" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.560580 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhk9w" event={"ID":"43789577-1762-4e7e-ac55-bcc788d0ac8b","Type":"ContainerStarted","Data":"0522c67fede25e6a4daf57bf7d1a94f357ce4be1c63fd4ea9e0020e55d97df9b"} Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.584876 4956 scope.go:117] "RemoveContainer" containerID="0e284a883f0be0d129e424418d4f595a0d8fb01b0f2a964464b32405fdf30884" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.599723 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8r5nk"] Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.611551 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8r5nk"] Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.650585 4956 scope.go:117] "RemoveContainer" containerID="5f97238964abdd317710759fe00af951092a1b52deabbd9feb26b92735a1cfcf" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.714527 4956 scope.go:117] "RemoveContainer" containerID="83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b" Sep 30 06:08:23 crc kubenswrapper[4956]: E0930 06:08:23.715180 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b\": container with ID starting with 83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b not found: ID does not exist" containerID="83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.715222 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b"} err="failed to get 
container status \"83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b\": rpc error: code = NotFound desc = could not find container \"83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b\": container with ID starting with 83eeb0e193f16e003752b6116511f7d72924c12fbcd5bf1d376bc8b559c16d8b not found: ID does not exist" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.715251 4956 scope.go:117] "RemoveContainer" containerID="0e284a883f0be0d129e424418d4f595a0d8fb01b0f2a964464b32405fdf30884" Sep 30 06:08:23 crc kubenswrapper[4956]: E0930 06:08:23.715638 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e284a883f0be0d129e424418d4f595a0d8fb01b0f2a964464b32405fdf30884\": container with ID starting with 0e284a883f0be0d129e424418d4f595a0d8fb01b0f2a964464b32405fdf30884 not found: ID does not exist" containerID="0e284a883f0be0d129e424418d4f595a0d8fb01b0f2a964464b32405fdf30884" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.715673 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e284a883f0be0d129e424418d4f595a0d8fb01b0f2a964464b32405fdf30884"} err="failed to get container status \"0e284a883f0be0d129e424418d4f595a0d8fb01b0f2a964464b32405fdf30884\": rpc error: code = NotFound desc = could not find container \"0e284a883f0be0d129e424418d4f595a0d8fb01b0f2a964464b32405fdf30884\": container with ID starting with 0e284a883f0be0d129e424418d4f595a0d8fb01b0f2a964464b32405fdf30884 not found: ID does not exist" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.715726 4956 scope.go:117] "RemoveContainer" containerID="5f97238964abdd317710759fe00af951092a1b52deabbd9feb26b92735a1cfcf" Sep 30 06:08:23 crc kubenswrapper[4956]: E0930 06:08:23.715974 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5f97238964abdd317710759fe00af951092a1b52deabbd9feb26b92735a1cfcf\": container with ID starting with 5f97238964abdd317710759fe00af951092a1b52deabbd9feb26b92735a1cfcf not found: ID does not exist" containerID="5f97238964abdd317710759fe00af951092a1b52deabbd9feb26b92735a1cfcf" Sep 30 06:08:23 crc kubenswrapper[4956]: I0930 06:08:23.715996 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f97238964abdd317710759fe00af951092a1b52deabbd9feb26b92735a1cfcf"} err="failed to get container status \"5f97238964abdd317710759fe00af951092a1b52deabbd9feb26b92735a1cfcf\": rpc error: code = NotFound desc = could not find container \"5f97238964abdd317710759fe00af951092a1b52deabbd9feb26b92735a1cfcf\": container with ID starting with 5f97238964abdd317710759fe00af951092a1b52deabbd9feb26b92735a1cfcf not found: ID does not exist" Sep 30 06:08:24 crc kubenswrapper[4956]: I0930 06:08:24.353168 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47ba6146-dfd3-44f9-b9c8-229b1ff1553b" path="/var/lib/kubelet/pods/47ba6146-dfd3-44f9-b9c8-229b1ff1553b/volumes" Sep 30 06:08:24 crc kubenswrapper[4956]: I0930 06:08:24.576225 4956 generic.go:334] "Generic (PLEG): container finished" podID="43789577-1762-4e7e-ac55-bcc788d0ac8b" containerID="b9128abb962765af6400cd4c1a042e3a51c2bc5978a07ab23cb76229c4f91a6a" exitCode=0 Sep 30 06:08:24 crc kubenswrapper[4956]: I0930 06:08:24.576282 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhk9w" event={"ID":"43789577-1762-4e7e-ac55-bcc788d0ac8b","Type":"ContainerDied","Data":"b9128abb962765af6400cd4c1a042e3a51c2bc5978a07ab23cb76229c4f91a6a"} Sep 30 06:08:26 crc kubenswrapper[4956]: I0930 06:08:26.596222 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhk9w" 
event={"ID":"43789577-1762-4e7e-ac55-bcc788d0ac8b","Type":"ContainerStarted","Data":"26322328a4b691cb28f189a872ffc77f5d98926ca25bce1c02ce4c3fb2883ec7"} Sep 30 06:08:27 crc kubenswrapper[4956]: I0930 06:08:27.627224 4956 generic.go:334] "Generic (PLEG): container finished" podID="43789577-1762-4e7e-ac55-bcc788d0ac8b" containerID="26322328a4b691cb28f189a872ffc77f5d98926ca25bce1c02ce4c3fb2883ec7" exitCode=0 Sep 30 06:08:27 crc kubenswrapper[4956]: I0930 06:08:27.627279 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhk9w" event={"ID":"43789577-1762-4e7e-ac55-bcc788d0ac8b","Type":"ContainerDied","Data":"26322328a4b691cb28f189a872ffc77f5d98926ca25bce1c02ce4c3fb2883ec7"} Sep 30 06:08:28 crc kubenswrapper[4956]: I0930 06:08:28.650102 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhk9w" event={"ID":"43789577-1762-4e7e-ac55-bcc788d0ac8b","Type":"ContainerStarted","Data":"02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746"} Sep 30 06:08:28 crc kubenswrapper[4956]: I0930 06:08:28.670172 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qhk9w" podStartSLOduration=3.098756605 podStartE2EDuration="6.670155486s" podCreationTimestamp="2025-09-30 06:08:22 +0000 UTC" firstStartedPulling="2025-09-30 06:08:24.579752079 +0000 UTC m=+2374.906872604" lastFinishedPulling="2025-09-30 06:08:28.15115096 +0000 UTC m=+2378.478271485" observedRunningTime="2025-09-30 06:08:28.666377189 +0000 UTC m=+2378.993497744" watchObservedRunningTime="2025-09-30 06:08:28.670155486 +0000 UTC m=+2378.997276021" Sep 30 06:08:32 crc kubenswrapper[4956]: I0930 06:08:32.750168 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:32 crc kubenswrapper[4956]: I0930 06:08:32.750590 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:32 crc kubenswrapper[4956]: I0930 06:08:32.798038 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:33 crc kubenswrapper[4956]: I0930 06:08:33.340660 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:08:33 crc kubenswrapper[4956]: E0930 06:08:33.340905 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:08:33 crc kubenswrapper[4956]: I0930 06:08:33.758675 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:33 crc kubenswrapper[4956]: I0930 06:08:33.806265 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qhk9w"] Sep 30 06:08:35 crc kubenswrapper[4956]: I0930 06:08:35.734771 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qhk9w" podUID="43789577-1762-4e7e-ac55-bcc788d0ac8b" containerName="registry-server" containerID="cri-o://02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746" gracePeriod=2 Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.229754 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.314392 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gbjl\" (UniqueName: \"kubernetes.io/projected/43789577-1762-4e7e-ac55-bcc788d0ac8b-kube-api-access-7gbjl\") pod \"43789577-1762-4e7e-ac55-bcc788d0ac8b\" (UID: \"43789577-1762-4e7e-ac55-bcc788d0ac8b\") " Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.314510 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789577-1762-4e7e-ac55-bcc788d0ac8b-catalog-content\") pod \"43789577-1762-4e7e-ac55-bcc788d0ac8b\" (UID: \"43789577-1762-4e7e-ac55-bcc788d0ac8b\") " Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.314647 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789577-1762-4e7e-ac55-bcc788d0ac8b-utilities\") pod \"43789577-1762-4e7e-ac55-bcc788d0ac8b\" (UID: \"43789577-1762-4e7e-ac55-bcc788d0ac8b\") " Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.315713 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43789577-1762-4e7e-ac55-bcc788d0ac8b-utilities" (OuterVolumeSpecName: "utilities") pod "43789577-1762-4e7e-ac55-bcc788d0ac8b" (UID: "43789577-1762-4e7e-ac55-bcc788d0ac8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.321287 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43789577-1762-4e7e-ac55-bcc788d0ac8b-kube-api-access-7gbjl" (OuterVolumeSpecName: "kube-api-access-7gbjl") pod "43789577-1762-4e7e-ac55-bcc788d0ac8b" (UID: "43789577-1762-4e7e-ac55-bcc788d0ac8b"). InnerVolumeSpecName "kube-api-access-7gbjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.416490 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789577-1762-4e7e-ac55-bcc788d0ac8b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.416523 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gbjl\" (UniqueName: \"kubernetes.io/projected/43789577-1762-4e7e-ac55-bcc788d0ac8b-kube-api-access-7gbjl\") on node \"crc\" DevicePath \"\"" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.744929 4956 generic.go:334] "Generic (PLEG): container finished" podID="43789577-1762-4e7e-ac55-bcc788d0ac8b" containerID="02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746" exitCode=0 Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.745015 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qhk9w" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.745061 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhk9w" event={"ID":"43789577-1762-4e7e-ac55-bcc788d0ac8b","Type":"ContainerDied","Data":"02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746"} Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.745091 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhk9w" event={"ID":"43789577-1762-4e7e-ac55-bcc788d0ac8b","Type":"ContainerDied","Data":"0522c67fede25e6a4daf57bf7d1a94f357ce4be1c63fd4ea9e0020e55d97df9b"} Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.745125 4956 scope.go:117] "RemoveContainer" containerID="02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.747162 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="16fa92a8-7fcd-45bd-9b5a-f77149ec71f4" containerID="ec7f82e9d856e6dfb313ed68a13dad6857d80b75193134327b9841b80a98296c" exitCode=0 Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.747508 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" event={"ID":"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4","Type":"ContainerDied","Data":"ec7f82e9d856e6dfb313ed68a13dad6857d80b75193134327b9841b80a98296c"} Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.766090 4956 scope.go:117] "RemoveContainer" containerID="26322328a4b691cb28f189a872ffc77f5d98926ca25bce1c02ce4c3fb2883ec7" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.789901 4956 scope.go:117] "RemoveContainer" containerID="b9128abb962765af6400cd4c1a042e3a51c2bc5978a07ab23cb76229c4f91a6a" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.853305 4956 scope.go:117] "RemoveContainer" containerID="02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746" Sep 30 06:08:36 crc kubenswrapper[4956]: E0930 06:08:36.853942 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746\": container with ID starting with 02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746 not found: ID does not exist" containerID="02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.853978 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746"} err="failed to get container status \"02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746\": rpc error: code = NotFound desc = could not find container \"02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746\": container with ID starting with 
02d9b76180ac0d280c971b655c0a8df50f848d3b989602039d4a836da3886746 not found: ID does not exist" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.854007 4956 scope.go:117] "RemoveContainer" containerID="26322328a4b691cb28f189a872ffc77f5d98926ca25bce1c02ce4c3fb2883ec7" Sep 30 06:08:36 crc kubenswrapper[4956]: E0930 06:08:36.854455 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26322328a4b691cb28f189a872ffc77f5d98926ca25bce1c02ce4c3fb2883ec7\": container with ID starting with 26322328a4b691cb28f189a872ffc77f5d98926ca25bce1c02ce4c3fb2883ec7 not found: ID does not exist" containerID="26322328a4b691cb28f189a872ffc77f5d98926ca25bce1c02ce4c3fb2883ec7" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.854485 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26322328a4b691cb28f189a872ffc77f5d98926ca25bce1c02ce4c3fb2883ec7"} err="failed to get container status \"26322328a4b691cb28f189a872ffc77f5d98926ca25bce1c02ce4c3fb2883ec7\": rpc error: code = NotFound desc = could not find container \"26322328a4b691cb28f189a872ffc77f5d98926ca25bce1c02ce4c3fb2883ec7\": container with ID starting with 26322328a4b691cb28f189a872ffc77f5d98926ca25bce1c02ce4c3fb2883ec7 not found: ID does not exist" Sep 30 06:08:36 crc kubenswrapper[4956]: I0930 06:08:36.854505 4956 scope.go:117] "RemoveContainer" containerID="b9128abb962765af6400cd4c1a042e3a51c2bc5978a07ab23cb76229c4f91a6a" Sep 30 06:08:36 crc kubenswrapper[4956]: E0930 06:08:36.854838 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9128abb962765af6400cd4c1a042e3a51c2bc5978a07ab23cb76229c4f91a6a\": container with ID starting with b9128abb962765af6400cd4c1a042e3a51c2bc5978a07ab23cb76229c4f91a6a not found: ID does not exist" containerID="b9128abb962765af6400cd4c1a042e3a51c2bc5978a07ab23cb76229c4f91a6a" Sep 30 06:08:36 crc 
kubenswrapper[4956]: I0930 06:08:36.854868 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9128abb962765af6400cd4c1a042e3a51c2bc5978a07ab23cb76229c4f91a6a"} err="failed to get container status \"b9128abb962765af6400cd4c1a042e3a51c2bc5978a07ab23cb76229c4f91a6a\": rpc error: code = NotFound desc = could not find container \"b9128abb962765af6400cd4c1a042e3a51c2bc5978a07ab23cb76229c4f91a6a\": container with ID starting with b9128abb962765af6400cd4c1a042e3a51c2bc5978a07ab23cb76229c4f91a6a not found: ID does not exist" Sep 30 06:08:37 crc kubenswrapper[4956]: I0930 06:08:37.475916 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43789577-1762-4e7e-ac55-bcc788d0ac8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43789577-1762-4e7e-ac55-bcc788d0ac8b" (UID: "43789577-1762-4e7e-ac55-bcc788d0ac8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:08:37 crc kubenswrapper[4956]: I0930 06:08:37.538041 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789577-1762-4e7e-ac55-bcc788d0ac8b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:08:37 crc kubenswrapper[4956]: I0930 06:08:37.690385 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qhk9w"] Sep 30 06:08:37 crc kubenswrapper[4956]: I0930 06:08:37.704705 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qhk9w"] Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.186164 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.248829 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpt2k\" (UniqueName: \"kubernetes.io/projected/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-kube-api-access-vpt2k\") pod \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") "
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.249277 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-libvirt-secret-0\") pod \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") "
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.249342 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-libvirt-combined-ca-bundle\") pod \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") "
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.249373 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-ssh-key\") pod \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") "
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.249427 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-inventory\") pod \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\" (UID: \"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4\") "
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.269219 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "16fa92a8-7fcd-45bd-9b5a-f77149ec71f4" (UID: "16fa92a8-7fcd-45bd-9b5a-f77149ec71f4"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.272230 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-kube-api-access-vpt2k" (OuterVolumeSpecName: "kube-api-access-vpt2k") pod "16fa92a8-7fcd-45bd-9b5a-f77149ec71f4" (UID: "16fa92a8-7fcd-45bd-9b5a-f77149ec71f4"). InnerVolumeSpecName "kube-api-access-vpt2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.288921 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "16fa92a8-7fcd-45bd-9b5a-f77149ec71f4" (UID: "16fa92a8-7fcd-45bd-9b5a-f77149ec71f4"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.289161 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "16fa92a8-7fcd-45bd-9b5a-f77149ec71f4" (UID: "16fa92a8-7fcd-45bd-9b5a-f77149ec71f4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.306813 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-inventory" (OuterVolumeSpecName: "inventory") pod "16fa92a8-7fcd-45bd-9b5a-f77149ec71f4" (UID: "16fa92a8-7fcd-45bd-9b5a-f77149ec71f4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.351668 4956 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.351707 4956 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.351723 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.351734 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.351746 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpt2k\" (UniqueName: \"kubernetes.io/projected/16fa92a8-7fcd-45bd-9b5a-f77149ec71f4-kube-api-access-vpt2k\") on node \"crc\" DevicePath \"\""
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.357067 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43789577-1762-4e7e-ac55-bcc788d0ac8b" path="/var/lib/kubelet/pods/43789577-1762-4e7e-ac55-bcc788d0ac8b/volumes"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.785586 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk" event={"ID":"16fa92a8-7fcd-45bd-9b5a-f77149ec71f4","Type":"ContainerDied","Data":"e467722f0d6a98213783b7b14bd6580da1a45bff48f315e6b1b569d7f1f8e8e7"}
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.785939 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e467722f0d6a98213783b7b14bd6580da1a45bff48f315e6b1b569d7f1f8e8e7"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.786145 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-scntk"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.878675 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"]
Sep 30 06:08:38 crc kubenswrapper[4956]: E0930 06:08:38.879102 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ba6146-dfd3-44f9-b9c8-229b1ff1553b" containerName="extract-content"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.879143 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ba6146-dfd3-44f9-b9c8-229b1ff1553b" containerName="extract-content"
Sep 30 06:08:38 crc kubenswrapper[4956]: E0930 06:08:38.879166 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43789577-1762-4e7e-ac55-bcc788d0ac8b" containerName="registry-server"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.879176 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="43789577-1762-4e7e-ac55-bcc788d0ac8b" containerName="registry-server"
Sep 30 06:08:38 crc kubenswrapper[4956]: E0930 06:08:38.879200 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ba6146-dfd3-44f9-b9c8-229b1ff1553b" containerName="registry-server"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.879212 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ba6146-dfd3-44f9-b9c8-229b1ff1553b" containerName="registry-server"
Sep 30 06:08:38 crc kubenswrapper[4956]: E0930 06:08:38.879248 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43789577-1762-4e7e-ac55-bcc788d0ac8b" containerName="extract-content"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.879258 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="43789577-1762-4e7e-ac55-bcc788d0ac8b" containerName="extract-content"
Sep 30 06:08:38 crc kubenswrapper[4956]: E0930 06:08:38.879275 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fa92a8-7fcd-45bd-9b5a-f77149ec71f4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.879288 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fa92a8-7fcd-45bd-9b5a-f77149ec71f4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Sep 30 06:08:38 crc kubenswrapper[4956]: E0930 06:08:38.879315 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43789577-1762-4e7e-ac55-bcc788d0ac8b" containerName="extract-utilities"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.879327 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="43789577-1762-4e7e-ac55-bcc788d0ac8b" containerName="extract-utilities"
Sep 30 06:08:38 crc kubenswrapper[4956]: E0930 06:08:38.879345 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ba6146-dfd3-44f9-b9c8-229b1ff1553b" containerName="extract-utilities"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.879353 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ba6146-dfd3-44f9-b9c8-229b1ff1553b" containerName="extract-utilities"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.879583 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="47ba6146-dfd3-44f9-b9c8-229b1ff1553b" containerName="registry-server"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.879600 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fa92a8-7fcd-45bd-9b5a-f77149ec71f4" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.879624 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="43789577-1762-4e7e-ac55-bcc788d0ac8b" containerName="registry-server"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.880485 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.887961 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.888437 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.888487 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.888583 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.888972 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.888998 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.889091 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.901324 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"]
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.963021 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.963085 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.963136 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.963157 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.963175 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.963208 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.963257 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5slq\" (UniqueName: \"kubernetes.io/projected/8c77505e-cdca-4f43-a276-2102b2c33a58-kube-api-access-f5slq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.963287 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:38 crc kubenswrapper[4956]: I0930 06:08:38.963306 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.065034 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5slq\" (UniqueName: \"kubernetes.io/projected/8c77505e-cdca-4f43-a276-2102b2c33a58-kube-api-access-f5slq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.065102 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.065136 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.065208 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.065245 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.065273 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.065291 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.065308 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.065338 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.067345 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.069454 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.069851 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.069982 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.070258 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.071372 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.071819 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.072059 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.084049 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5slq\" (UniqueName: \"kubernetes.io/projected/8c77505e-cdca-4f43-a276-2102b2c33a58-kube-api-access-f5slq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4vfr\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.198720 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:08:39 crc kubenswrapper[4956]: I0930 06:08:39.801377 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"]
Sep 30 06:08:39 crc kubenswrapper[4956]: W0930 06:08:39.815493 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c77505e_cdca_4f43_a276_2102b2c33a58.slice/crio-59944ed51a519c1d5b4192fecff9160d5ce87fa5b2e3f07845f2b529e2de452b WatchSource:0}: Error finding container 59944ed51a519c1d5b4192fecff9160d5ce87fa5b2e3f07845f2b529e2de452b: Status 404 returned error can't find the container with id 59944ed51a519c1d5b4192fecff9160d5ce87fa5b2e3f07845f2b529e2de452b
Sep 30 06:08:40 crc kubenswrapper[4956]: I0930 06:08:40.805105 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr" event={"ID":"8c77505e-cdca-4f43-a276-2102b2c33a58","Type":"ContainerStarted","Data":"9aca5aa59529d1629b9b494cf3892f6cdff6a3dddfa18875620bd27cda4e3193"}
Sep 30 06:08:40 crc kubenswrapper[4956]: I0930 06:08:40.805450 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr" event={"ID":"8c77505e-cdca-4f43-a276-2102b2c33a58","Type":"ContainerStarted","Data":"59944ed51a519c1d5b4192fecff9160d5ce87fa5b2e3f07845f2b529e2de452b"}
Sep 30 06:08:40 crc kubenswrapper[4956]: I0930 06:08:40.829621 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr" podStartSLOduration=2.309233972 podStartE2EDuration="2.829604442s" podCreationTimestamp="2025-09-30 06:08:38 +0000 UTC" firstStartedPulling="2025-09-30 06:08:39.817374875 +0000 UTC m=+2390.144495400" lastFinishedPulling="2025-09-30 06:08:40.337745345 +0000 UTC m=+2390.664865870" observedRunningTime="2025-09-30 06:08:40.82537679 +0000 UTC m=+2391.152497315" watchObservedRunningTime="2025-09-30 06:08:40.829604442 +0000 UTC m=+2391.156724967"
Sep 30 06:08:47 crc kubenswrapper[4956]: I0930 06:08:47.340934 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894"
Sep 30 06:08:47 crc kubenswrapper[4956]: E0930 06:08:47.341953 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193"
Sep 30 06:08:58 crc kubenswrapper[4956]: I0930 06:08:58.342373 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894"
Sep 30 06:08:58 crc kubenswrapper[4956]: E0930 06:08:58.343754 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193"
Sep 30 06:09:12 crc kubenswrapper[4956]: I0930 06:09:12.341802 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894"
Sep 30 06:09:12 crc kubenswrapper[4956]: E0930 06:09:12.345290 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193"
Sep 30 06:09:26 crc kubenswrapper[4956]: I0930 06:09:26.341742 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894"
Sep 30 06:09:26 crc kubenswrapper[4956]: E0930 06:09:26.342890 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193"
Sep 30 06:09:40 crc kubenswrapper[4956]: I0930 06:09:40.349294 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894"
Sep 30 06:09:40 crc kubenswrapper[4956]: E0930 06:09:40.350024 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193"
Sep 30 06:09:54 crc kubenswrapper[4956]: I0930 06:09:54.341773 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894"
Sep 30 06:09:54 crc kubenswrapper[4956]: I0930 06:09:54.642003 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"4dc15bfc3f84595554d9630720800f9c8292e405d7c3920824417d9ee8c558cc"}
Sep 30 06:11:14 crc kubenswrapper[4956]: I0930 06:11:14.170798 4956 scope.go:117] "RemoveContainer" containerID="6532f49686669573cd7f457a9e7c891882c2f146fbbb67e014b824f73a837510"
Sep 30 06:11:14 crc kubenswrapper[4956]: I0930 06:11:14.222612 4956 scope.go:117] "RemoveContainer" containerID="46a5b699e8f772ae6ae7905c908e1e9b2409f6b313fb1ffc760d748b599abb92"
Sep 30 06:11:14 crc kubenswrapper[4956]: I0930 06:11:14.293095 4956 scope.go:117] "RemoveContainer" containerID="1809ff8c1a5be636962b5285f3fe124995c7753a15e0b287d36ae71f802936a9"
Sep 30 06:12:18 crc kubenswrapper[4956]: I0930 06:12:18.073646 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 06:12:18 crc kubenswrapper[4956]: I0930 06:12:18.074267 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 06:12:36 crc kubenswrapper[4956]: I0930 06:12:36.412662 4956 generic.go:334] "Generic (PLEG): container finished" podID="8c77505e-cdca-4f43-a276-2102b2c33a58" containerID="9aca5aa59529d1629b9b494cf3892f6cdff6a3dddfa18875620bd27cda4e3193" exitCode=0
Sep 30 06:12:36 crc kubenswrapper[4956]: I0930 06:12:36.412742 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr" event={"ID":"8c77505e-cdca-4f43-a276-2102b2c33a58","Type":"ContainerDied","Data":"9aca5aa59529d1629b9b494cf3892f6cdff6a3dddfa18875620bd27cda4e3193"}
Sep 30 06:12:37 crc kubenswrapper[4956]: I0930 06:12:37.948633 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr"
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.139832 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-cell1-compute-config-1\") pod \"8c77505e-cdca-4f43-a276-2102b2c33a58\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") "
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.139933 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5slq\" (UniqueName: \"kubernetes.io/projected/8c77505e-cdca-4f43-a276-2102b2c33a58-kube-api-access-f5slq\") pod \"8c77505e-cdca-4f43-a276-2102b2c33a58\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") "
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.139970 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-cell1-compute-config-0\") pod \"8c77505e-cdca-4f43-a276-2102b2c33a58\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") "
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.140007 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-ssh-key\") pod \"8c77505e-cdca-4f43-a276-2102b2c33a58\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") "
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.140056 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-migration-ssh-key-0\") pod \"8c77505e-cdca-4f43-a276-2102b2c33a58\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") "
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.140083 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-migration-ssh-key-1\") pod \"8c77505e-cdca-4f43-a276-2102b2c33a58\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") "
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.140135 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-combined-ca-bundle\") pod \"8c77505e-cdca-4f43-a276-2102b2c33a58\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") "
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.140171 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-extra-config-0\") pod \"8c77505e-cdca-4f43-a276-2102b2c33a58\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") "
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.140235 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-inventory\") pod \"8c77505e-cdca-4f43-a276-2102b2c33a58\" (UID: \"8c77505e-cdca-4f43-a276-2102b2c33a58\") "
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.145867 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c77505e-cdca-4f43-a276-2102b2c33a58-kube-api-access-f5slq" (OuterVolumeSpecName: "kube-api-access-f5slq") pod "8c77505e-cdca-4f43-a276-2102b2c33a58" (UID: "8c77505e-cdca-4f43-a276-2102b2c33a58"). InnerVolumeSpecName "kube-api-access-f5slq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.156381 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8c77505e-cdca-4f43-a276-2102b2c33a58" (UID: "8c77505e-cdca-4f43-a276-2102b2c33a58"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.167819 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-inventory" (OuterVolumeSpecName: "inventory") pod "8c77505e-cdca-4f43-a276-2102b2c33a58" (UID: "8c77505e-cdca-4f43-a276-2102b2c33a58"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.169974 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "8c77505e-cdca-4f43-a276-2102b2c33a58" (UID: "8c77505e-cdca-4f43-a276-2102b2c33a58"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.177175 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "8c77505e-cdca-4f43-a276-2102b2c33a58" (UID: "8c77505e-cdca-4f43-a276-2102b2c33a58"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.181979 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8c77505e-cdca-4f43-a276-2102b2c33a58" (UID: "8c77505e-cdca-4f43-a276-2102b2c33a58"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.183384 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "8c77505e-cdca-4f43-a276-2102b2c33a58" (UID: "8c77505e-cdca-4f43-a276-2102b2c33a58"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.187372 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "8c77505e-cdca-4f43-a276-2102b2c33a58" (UID: "8c77505e-cdca-4f43-a276-2102b2c33a58"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.191101 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "8c77505e-cdca-4f43-a276-2102b2c33a58" (UID: "8c77505e-cdca-4f43-a276-2102b2c33a58"). InnerVolumeSpecName "nova-cell1-compute-config-1".
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.242083 4956 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.242139 4956 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.242154 4956 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.242168 4956 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.242180 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.242192 4956 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.242204 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5slq\" (UniqueName: \"kubernetes.io/projected/8c77505e-cdca-4f43-a276-2102b2c33a58-kube-api-access-f5slq\") on node \"crc\" 
DevicePath \"\"" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.242216 4956 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.242227 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c77505e-cdca-4f43-a276-2102b2c33a58-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.437469 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr" event={"ID":"8c77505e-cdca-4f43-a276-2102b2c33a58","Type":"ContainerDied","Data":"59944ed51a519c1d5b4192fecff9160d5ce87fa5b2e3f07845f2b529e2de452b"} Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.437514 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59944ed51a519c1d5b4192fecff9160d5ce87fa5b2e3f07845f2b529e2de452b" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.437576 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4vfr" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.644214 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c"] Sep 30 06:12:38 crc kubenswrapper[4956]: E0930 06:12:38.645305 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c77505e-cdca-4f43-a276-2102b2c33a58" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.645339 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c77505e-cdca-4f43-a276-2102b2c33a58" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.645667 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c77505e-cdca-4f43-a276-2102b2c33a58" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.646552 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.650046 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.650077 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.650370 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.651674 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c"] Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.653681 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.665864 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-48xmn" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.751313 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.751363 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: 
\"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.751440 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.751490 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.751527 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.751574 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgr9d\" (UniqueName: \"kubernetes.io/projected/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-kube-api-access-zgr9d\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 
06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.751597 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.852875 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.852987 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.853028 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.853063 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgr9d\" 
(UniqueName: \"kubernetes.io/projected/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-kube-api-access-zgr9d\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.853095 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.853223 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.853256 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.857684 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.858418 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.858486 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.858688 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.859249 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.860610 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.871477 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgr9d\" (UniqueName: \"kubernetes.io/projected/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-kube-api-access-zgr9d\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:38 crc kubenswrapper[4956]: I0930 06:12:38.966710 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:12:40 crc kubenswrapper[4956]: I0930 06:12:40.007024 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c"] Sep 30 06:12:40 crc kubenswrapper[4956]: I0930 06:12:40.462075 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" event={"ID":"67054ea3-1f2b-43dd-ada6-8908e9f7c2de","Type":"ContainerStarted","Data":"d72580c9a2f03bbbb7a16d3822899c310d7cca80908d10e52530c821368adf18"} Sep 30 06:12:41 crc kubenswrapper[4956]: I0930 06:12:41.484843 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" event={"ID":"67054ea3-1f2b-43dd-ada6-8908e9f7c2de","Type":"ContainerStarted","Data":"5b3c90134d67a73f4a4ac88c5e41c8241511b51bd05d4575a3694c454952b6a1"} Sep 30 06:12:41 crc kubenswrapper[4956]: I0930 06:12:41.525040 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" podStartSLOduration=2.929062796 podStartE2EDuration="3.525018407s" podCreationTimestamp="2025-09-30 06:12:38 +0000 UTC" firstStartedPulling="2025-09-30 06:12:40.01039159 +0000 UTC m=+2630.337512115" lastFinishedPulling="2025-09-30 06:12:40.606347181 +0000 UTC m=+2630.933467726" observedRunningTime="2025-09-30 06:12:41.516106127 +0000 UTC m=+2631.843226672" watchObservedRunningTime="2025-09-30 06:12:41.525018407 +0000 UTC m=+2631.852138922" Sep 30 06:12:48 crc kubenswrapper[4956]: I0930 06:12:48.074039 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:12:48 crc kubenswrapper[4956]: I0930 06:12:48.074564 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:13:18 crc kubenswrapper[4956]: I0930 06:13:18.073844 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:13:18 crc kubenswrapper[4956]: I0930 06:13:18.074617 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Sep 30 06:13:18 crc kubenswrapper[4956]: I0930 06:13:18.074697 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 06:13:18 crc kubenswrapper[4956]: I0930 06:13:18.076943 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4dc15bfc3f84595554d9630720800f9c8292e405d7c3920824417d9ee8c558cc"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:13:18 crc kubenswrapper[4956]: I0930 06:13:18.077173 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://4dc15bfc3f84595554d9630720800f9c8292e405d7c3920824417d9ee8c558cc" gracePeriod=600 Sep 30 06:13:18 crc kubenswrapper[4956]: I0930 06:13:18.912592 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="4dc15bfc3f84595554d9630720800f9c8292e405d7c3920824417d9ee8c558cc" exitCode=0 Sep 30 06:13:18 crc kubenswrapper[4956]: I0930 06:13:18.912634 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"4dc15bfc3f84595554d9630720800f9c8292e405d7c3920824417d9ee8c558cc"} Sep 30 06:13:18 crc kubenswrapper[4956]: I0930 06:13:18.913213 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b"} Sep 30 06:13:18 crc 
kubenswrapper[4956]: I0930 06:13:18.913235 4956 scope.go:117] "RemoveContainer" containerID="9c778e3a71565f6a239ffc1ec6ce6150a25aa27ee6ef46373ca399744196f894" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.175382 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj"] Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.178751 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.181183 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.182646 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.190741 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj"] Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.312882 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92d539a2-c896-4f86-a2ba-6609af83673b-config-volume\") pod \"collect-profiles-29320215-6z7zj\" (UID: \"92d539a2-c896-4f86-a2ba-6609af83673b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.313217 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d4bb\" (UniqueName: \"kubernetes.io/projected/92d539a2-c896-4f86-a2ba-6609af83673b-kube-api-access-4d4bb\") pod \"collect-profiles-29320215-6z7zj\" (UID: \"92d539a2-c896-4f86-a2ba-6609af83673b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.313249 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92d539a2-c896-4f86-a2ba-6609af83673b-secret-volume\") pod \"collect-profiles-29320215-6z7zj\" (UID: \"92d539a2-c896-4f86-a2ba-6609af83673b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.415808 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92d539a2-c896-4f86-a2ba-6609af83673b-config-volume\") pod \"collect-profiles-29320215-6z7zj\" (UID: \"92d539a2-c896-4f86-a2ba-6609af83673b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.415939 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d4bb\" (UniqueName: \"kubernetes.io/projected/92d539a2-c896-4f86-a2ba-6609af83673b-kube-api-access-4d4bb\") pod \"collect-profiles-29320215-6z7zj\" (UID: \"92d539a2-c896-4f86-a2ba-6609af83673b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.416005 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92d539a2-c896-4f86-a2ba-6609af83673b-secret-volume\") pod \"collect-profiles-29320215-6z7zj\" (UID: \"92d539a2-c896-4f86-a2ba-6609af83673b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.417277 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/92d539a2-c896-4f86-a2ba-6609af83673b-config-volume\") pod \"collect-profiles-29320215-6z7zj\" (UID: \"92d539a2-c896-4f86-a2ba-6609af83673b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.423342 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92d539a2-c896-4f86-a2ba-6609af83673b-secret-volume\") pod \"collect-profiles-29320215-6z7zj\" (UID: \"92d539a2-c896-4f86-a2ba-6609af83673b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.432496 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d4bb\" (UniqueName: \"kubernetes.io/projected/92d539a2-c896-4f86-a2ba-6609af83673b-kube-api-access-4d4bb\") pod \"collect-profiles-29320215-6z7zj\" (UID: \"92d539a2-c896-4f86-a2ba-6609af83673b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.513214 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:00 crc kubenswrapper[4956]: I0930 06:15:00.983027 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj"] Sep 30 06:15:01 crc kubenswrapper[4956]: I0930 06:15:01.087988 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" event={"ID":"92d539a2-c896-4f86-a2ba-6609af83673b","Type":"ContainerStarted","Data":"4d8bd55fc75f6d75cf96e850076b3c43f6d956b0524085e9ee9d52abc53da4fa"} Sep 30 06:15:02 crc kubenswrapper[4956]: I0930 06:15:02.104898 4956 generic.go:334] "Generic (PLEG): container finished" podID="92d539a2-c896-4f86-a2ba-6609af83673b" containerID="4c185e26652bc5538cd47def6201cab4721e3feffd31072266428d96db08c111" exitCode=0 Sep 30 06:15:02 crc kubenswrapper[4956]: I0930 06:15:02.104987 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" event={"ID":"92d539a2-c896-4f86-a2ba-6609af83673b","Type":"ContainerDied","Data":"4c185e26652bc5538cd47def6201cab4721e3feffd31072266428d96db08c111"} Sep 30 06:15:03 crc kubenswrapper[4956]: I0930 06:15:03.429138 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:03 crc kubenswrapper[4956]: I0930 06:15:03.601380 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92d539a2-c896-4f86-a2ba-6609af83673b-secret-volume\") pod \"92d539a2-c896-4f86-a2ba-6609af83673b\" (UID: \"92d539a2-c896-4f86-a2ba-6609af83673b\") " Sep 30 06:15:03 crc kubenswrapper[4956]: I0930 06:15:03.601467 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92d539a2-c896-4f86-a2ba-6609af83673b-config-volume\") pod \"92d539a2-c896-4f86-a2ba-6609af83673b\" (UID: \"92d539a2-c896-4f86-a2ba-6609af83673b\") " Sep 30 06:15:03 crc kubenswrapper[4956]: I0930 06:15:03.601552 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4bb\" (UniqueName: \"kubernetes.io/projected/92d539a2-c896-4f86-a2ba-6609af83673b-kube-api-access-4d4bb\") pod \"92d539a2-c896-4f86-a2ba-6609af83673b\" (UID: \"92d539a2-c896-4f86-a2ba-6609af83673b\") " Sep 30 06:15:03 crc kubenswrapper[4956]: I0930 06:15:03.602004 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d539a2-c896-4f86-a2ba-6609af83673b-config-volume" (OuterVolumeSpecName: "config-volume") pod "92d539a2-c896-4f86-a2ba-6609af83673b" (UID: "92d539a2-c896-4f86-a2ba-6609af83673b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:15:03 crc kubenswrapper[4956]: I0930 06:15:03.607407 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d539a2-c896-4f86-a2ba-6609af83673b-kube-api-access-4d4bb" (OuterVolumeSpecName: "kube-api-access-4d4bb") pod "92d539a2-c896-4f86-a2ba-6609af83673b" (UID: "92d539a2-c896-4f86-a2ba-6609af83673b"). 
InnerVolumeSpecName "kube-api-access-4d4bb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:15:03 crc kubenswrapper[4956]: I0930 06:15:03.608669 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d539a2-c896-4f86-a2ba-6609af83673b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "92d539a2-c896-4f86-a2ba-6609af83673b" (UID: "92d539a2-c896-4f86-a2ba-6609af83673b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:15:03 crc kubenswrapper[4956]: I0930 06:15:03.704437 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4bb\" (UniqueName: \"kubernetes.io/projected/92d539a2-c896-4f86-a2ba-6609af83673b-kube-api-access-4d4bb\") on node \"crc\" DevicePath \"\"" Sep 30 06:15:03 crc kubenswrapper[4956]: I0930 06:15:03.704504 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92d539a2-c896-4f86-a2ba-6609af83673b-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:15:03 crc kubenswrapper[4956]: I0930 06:15:03.704523 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92d539a2-c896-4f86-a2ba-6609af83673b-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:15:04 crc kubenswrapper[4956]: I0930 06:15:04.125936 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" event={"ID":"92d539a2-c896-4f86-a2ba-6609af83673b","Type":"ContainerDied","Data":"4d8bd55fc75f6d75cf96e850076b3c43f6d956b0524085e9ee9d52abc53da4fa"} Sep 30 06:15:04 crc kubenswrapper[4956]: I0930 06:15:04.125977 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d8bd55fc75f6d75cf96e850076b3c43f6d956b0524085e9ee9d52abc53da4fa" Sep 30 06:15:04 crc kubenswrapper[4956]: I0930 06:15:04.126375 4956 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj" Sep 30 06:15:04 crc kubenswrapper[4956]: I0930 06:15:04.506226 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht"] Sep 30 06:15:04 crc kubenswrapper[4956]: I0930 06:15:04.514875 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320170-h7hht"] Sep 30 06:15:06 crc kubenswrapper[4956]: I0930 06:15:06.358962 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="140b2851-bf05-4ec3-87db-657aacefdbd4" path="/var/lib/kubelet/pods/140b2851-bf05-4ec3-87db-657aacefdbd4/volumes" Sep 30 06:15:14 crc kubenswrapper[4956]: I0930 06:15:14.410644 4956 scope.go:117] "RemoveContainer" containerID="a8158496e72dde154eee6af181c09d157c45adbc5671ed3ffee47dfc313fcd29" Sep 30 06:15:18 crc kubenswrapper[4956]: I0930 06:15:18.073641 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:15:18 crc kubenswrapper[4956]: I0930 06:15:18.074291 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:15:21 crc kubenswrapper[4956]: I0930 06:15:21.297597 4956 generic.go:334] "Generic (PLEG): container finished" podID="67054ea3-1f2b-43dd-ada6-8908e9f7c2de" containerID="5b3c90134d67a73f4a4ac88c5e41c8241511b51bd05d4575a3694c454952b6a1" exitCode=0 Sep 30 06:15:21 crc kubenswrapper[4956]: I0930 
06:15:21.297757 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" event={"ID":"67054ea3-1f2b-43dd-ada6-8908e9f7c2de","Type":"ContainerDied","Data":"5b3c90134d67a73f4a4ac88c5e41c8241511b51bd05d4575a3694c454952b6a1"} Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.751817 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.906937 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ssh-key\") pod \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.907426 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgr9d\" (UniqueName: \"kubernetes.io/projected/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-kube-api-access-zgr9d\") pod \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.907488 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-1\") pod \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.907536 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-0\") pod \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " Sep 30 06:15:22 crc 
kubenswrapper[4956]: I0930 06:15:22.907563 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-telemetry-combined-ca-bundle\") pod \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.907625 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-inventory\") pod \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.907650 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-2\") pod \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\" (UID: \"67054ea3-1f2b-43dd-ada6-8908e9f7c2de\") " Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.914084 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "67054ea3-1f2b-43dd-ada6-8908e9f7c2de" (UID: "67054ea3-1f2b-43dd-ada6-8908e9f7c2de"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.914751 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-kube-api-access-zgr9d" (OuterVolumeSpecName: "kube-api-access-zgr9d") pod "67054ea3-1f2b-43dd-ada6-8908e9f7c2de" (UID: "67054ea3-1f2b-43dd-ada6-8908e9f7c2de"). InnerVolumeSpecName "kube-api-access-zgr9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.942303 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "67054ea3-1f2b-43dd-ada6-8908e9f7c2de" (UID: "67054ea3-1f2b-43dd-ada6-8908e9f7c2de"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.943404 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "67054ea3-1f2b-43dd-ada6-8908e9f7c2de" (UID: "67054ea3-1f2b-43dd-ada6-8908e9f7c2de"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.947559 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-inventory" (OuterVolumeSpecName: "inventory") pod "67054ea3-1f2b-43dd-ada6-8908e9f7c2de" (UID: "67054ea3-1f2b-43dd-ada6-8908e9f7c2de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.970868 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "67054ea3-1f2b-43dd-ada6-8908e9f7c2de" (UID: "67054ea3-1f2b-43dd-ada6-8908e9f7c2de"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:15:22 crc kubenswrapper[4956]: I0930 06:15:22.971547 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "67054ea3-1f2b-43dd-ada6-8908e9f7c2de" (UID: "67054ea3-1f2b-43dd-ada6-8908e9f7c2de"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:15:23 crc kubenswrapper[4956]: I0930 06:15:23.009971 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgr9d\" (UniqueName: \"kubernetes.io/projected/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-kube-api-access-zgr9d\") on node \"crc\" DevicePath \"\"" Sep 30 06:15:23 crc kubenswrapper[4956]: I0930 06:15:23.010012 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Sep 30 06:15:23 crc kubenswrapper[4956]: I0930 06:15:23.010026 4956 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:15:23 crc kubenswrapper[4956]: I0930 06:15:23.010037 4956 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:15:23 crc kubenswrapper[4956]: I0930 06:15:23.010050 4956 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:15:23 crc kubenswrapper[4956]: I0930 06:15:23.010061 4956 reconciler_common.go:293] 
"Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Sep 30 06:15:23 crc kubenswrapper[4956]: I0930 06:15:23.010071 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67054ea3-1f2b-43dd-ada6-8908e9f7c2de-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:15:23 crc kubenswrapper[4956]: I0930 06:15:23.318760 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" event={"ID":"67054ea3-1f2b-43dd-ada6-8908e9f7c2de","Type":"ContainerDied","Data":"d72580c9a2f03bbbb7a16d3822899c310d7cca80908d10e52530c821368adf18"} Sep 30 06:15:23 crc kubenswrapper[4956]: I0930 06:15:23.318801 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d72580c9a2f03bbbb7a16d3822899c310d7cca80908d10e52530c821368adf18" Sep 30 06:15:23 crc kubenswrapper[4956]: I0930 06:15:23.318853 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c" Sep 30 06:15:48 crc kubenswrapper[4956]: I0930 06:15:48.073447 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:15:48 crc kubenswrapper[4956]: I0930 06:15:48.074154 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.697795 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s8vz6"] Sep 30 06:15:55 crc kubenswrapper[4956]: E0930 06:15:55.698819 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67054ea3-1f2b-43dd-ada6-8908e9f7c2de" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.698834 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="67054ea3-1f2b-43dd-ada6-8908e9f7c2de" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 06:15:55 crc kubenswrapper[4956]: E0930 06:15:55.698876 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d539a2-c896-4f86-a2ba-6609af83673b" containerName="collect-profiles" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.698883 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d539a2-c896-4f86-a2ba-6609af83673b" containerName="collect-profiles" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.699101 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="67054ea3-1f2b-43dd-ada6-8908e9f7c2de" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.699140 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d539a2-c896-4f86-a2ba-6609af83673b" containerName="collect-profiles" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.700944 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.714307 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8vz6"] Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.883108 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9159453a-2fc2-4806-b7b3-10a148f2326d-utilities\") pod \"certified-operators-s8vz6\" (UID: \"9159453a-2fc2-4806-b7b3-10a148f2326d\") " pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.883284 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdlnz\" (UniqueName: \"kubernetes.io/projected/9159453a-2fc2-4806-b7b3-10a148f2326d-kube-api-access-rdlnz\") pod \"certified-operators-s8vz6\" (UID: \"9159453a-2fc2-4806-b7b3-10a148f2326d\") " pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.883309 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9159453a-2fc2-4806-b7b3-10a148f2326d-catalog-content\") pod \"certified-operators-s8vz6\" (UID: \"9159453a-2fc2-4806-b7b3-10a148f2326d\") " pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.984859 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdlnz\" (UniqueName: \"kubernetes.io/projected/9159453a-2fc2-4806-b7b3-10a148f2326d-kube-api-access-rdlnz\") pod \"certified-operators-s8vz6\" (UID: \"9159453a-2fc2-4806-b7b3-10a148f2326d\") " pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.984906 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9159453a-2fc2-4806-b7b3-10a148f2326d-catalog-content\") pod \"certified-operators-s8vz6\" (UID: \"9159453a-2fc2-4806-b7b3-10a148f2326d\") " pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.984980 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9159453a-2fc2-4806-b7b3-10a148f2326d-utilities\") pod \"certified-operators-s8vz6\" (UID: \"9159453a-2fc2-4806-b7b3-10a148f2326d\") " pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.985744 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9159453a-2fc2-4806-b7b3-10a148f2326d-catalog-content\") pod \"certified-operators-s8vz6\" (UID: \"9159453a-2fc2-4806-b7b3-10a148f2326d\") " pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:15:55 crc kubenswrapper[4956]: I0930 06:15:55.985754 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9159453a-2fc2-4806-b7b3-10a148f2326d-utilities\") pod \"certified-operators-s8vz6\" (UID: \"9159453a-2fc2-4806-b7b3-10a148f2326d\") " pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:15:56 crc kubenswrapper[4956]: I0930 06:15:56.007032 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rdlnz\" (UniqueName: \"kubernetes.io/projected/9159453a-2fc2-4806-b7b3-10a148f2326d-kube-api-access-rdlnz\") pod \"certified-operators-s8vz6\" (UID: \"9159453a-2fc2-4806-b7b3-10a148f2326d\") " pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:15:56 crc kubenswrapper[4956]: I0930 06:15:56.022753 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:15:56 crc kubenswrapper[4956]: I0930 06:15:56.539097 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8vz6"] Sep 30 06:15:56 crc kubenswrapper[4956]: I0930 06:15:56.676593 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8vz6" event={"ID":"9159453a-2fc2-4806-b7b3-10a148f2326d","Type":"ContainerStarted","Data":"92d70675a53934cf5b3c3e82733ccf2c764b7370426888f409b3042d310f16dc"} Sep 30 06:15:57 crc kubenswrapper[4956]: I0930 06:15:57.691522 4956 generic.go:334] "Generic (PLEG): container finished" podID="9159453a-2fc2-4806-b7b3-10a148f2326d" containerID="36318d79015882bda46e4ab030b5664c2a584fadd06c061dc72fcb6fd26c50f6" exitCode=0 Sep 30 06:15:57 crc kubenswrapper[4956]: I0930 06:15:57.691679 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8vz6" event={"ID":"9159453a-2fc2-4806-b7b3-10a148f2326d","Type":"ContainerDied","Data":"36318d79015882bda46e4ab030b5664c2a584fadd06c061dc72fcb6fd26c50f6"} Sep 30 06:15:57 crc kubenswrapper[4956]: I0930 06:15:57.695730 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:15:58 crc kubenswrapper[4956]: I0930 06:15:58.704684 4956 generic.go:334] "Generic (PLEG): container finished" podID="9159453a-2fc2-4806-b7b3-10a148f2326d" containerID="aaccc3921bf5c369a164c93c0fb56a9e3f6f1ed2cbc1da9fd34d143349b587c1" exitCode=0 Sep 30 
06:15:58 crc kubenswrapper[4956]: I0930 06:15:58.704743 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8vz6" event={"ID":"9159453a-2fc2-4806-b7b3-10a148f2326d","Type":"ContainerDied","Data":"aaccc3921bf5c369a164c93c0fb56a9e3f6f1ed2cbc1da9fd34d143349b587c1"} Sep 30 06:15:59 crc kubenswrapper[4956]: I0930 06:15:59.114882 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:15:59 crc kubenswrapper[4956]: I0930 06:15:59.115395 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="prometheus" containerID="cri-o://19c74458ef26b83bc07dab14c067ef515cc7ce8a2ed066c8d46e4a9a1b05745b" gracePeriod=600 Sep 30 06:15:59 crc kubenswrapper[4956]: I0930 06:15:59.115661 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="thanos-sidecar" containerID="cri-o://97f6c0624224989180f4175a911599b14468c1c98c87cd5acfbfb0bfaeb53233" gracePeriod=600 Sep 30 06:15:59 crc kubenswrapper[4956]: I0930 06:15:59.115718 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="config-reloader" containerID="cri-o://83eb93e1d04ae7cee3cd63476ad6e49e021a57bbe3ff75ebba7a0fddd5c8e5ea" gracePeriod=600 Sep 30 06:15:59 crc kubenswrapper[4956]: I0930 06:15:59.737317 4956 generic.go:334] "Generic (PLEG): container finished" podID="de710cda-e9a2-426f-a617-2ac08ef16386" containerID="97f6c0624224989180f4175a911599b14468c1c98c87cd5acfbfb0bfaeb53233" exitCode=0 Sep 30 06:15:59 crc kubenswrapper[4956]: I0930 06:15:59.737622 4956 generic.go:334] "Generic (PLEG): container finished" podID="de710cda-e9a2-426f-a617-2ac08ef16386" 
containerID="83eb93e1d04ae7cee3cd63476ad6e49e021a57bbe3ff75ebba7a0fddd5c8e5ea" exitCode=0 Sep 30 06:15:59 crc kubenswrapper[4956]: I0930 06:15:59.737632 4956 generic.go:334] "Generic (PLEG): container finished" podID="de710cda-e9a2-426f-a617-2ac08ef16386" containerID="19c74458ef26b83bc07dab14c067ef515cc7ce8a2ed066c8d46e4a9a1b05745b" exitCode=0 Sep 30 06:15:59 crc kubenswrapper[4956]: I0930 06:15:59.737676 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"de710cda-e9a2-426f-a617-2ac08ef16386","Type":"ContainerDied","Data":"97f6c0624224989180f4175a911599b14468c1c98c87cd5acfbfb0bfaeb53233"} Sep 30 06:15:59 crc kubenswrapper[4956]: I0930 06:15:59.737703 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"de710cda-e9a2-426f-a617-2ac08ef16386","Type":"ContainerDied","Data":"83eb93e1d04ae7cee3cd63476ad6e49e021a57bbe3ff75ebba7a0fddd5c8e5ea"} Sep 30 06:15:59 crc kubenswrapper[4956]: I0930 06:15:59.737715 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"de710cda-e9a2-426f-a617-2ac08ef16386","Type":"ContainerDied","Data":"19c74458ef26b83bc07dab14c067ef515cc7ce8a2ed066c8d46e4a9a1b05745b"} Sep 30 06:15:59 crc kubenswrapper[4956]: I0930 06:15:59.757283 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8vz6" event={"ID":"9159453a-2fc2-4806-b7b3-10a148f2326d","Type":"ContainerStarted","Data":"107f01d09bae69691e00c7f25140d7d914ea30cda11e7de85f9572bcc01e78b2"} Sep 30 06:15:59 crc kubenswrapper[4956]: I0930 06:15:59.779960 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s8vz6" podStartSLOduration=3.405341769 podStartE2EDuration="4.779946608s" podCreationTimestamp="2025-09-30 06:15:55 +0000 UTC" firstStartedPulling="2025-09-30 06:15:57.695181594 +0000 UTC m=+2828.022302129" 
lastFinishedPulling="2025-09-30 06:15:59.069786443 +0000 UTC m=+2829.396906968" observedRunningTime="2025-09-30 06:15:59.776746108 +0000 UTC m=+2830.103866633" watchObservedRunningTime="2025-09-30 06:15:59.779946608 +0000 UTC m=+2830.107067133" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.195547 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.375605 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-config\") pod \"de710cda-e9a2-426f-a617-2ac08ef16386\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.375913 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2jh8\" (UniqueName: \"kubernetes.io/projected/de710cda-e9a2-426f-a617-2ac08ef16386-kube-api-access-f2jh8\") pod \"de710cda-e9a2-426f-a617-2ac08ef16386\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.376773 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de710cda-e9a2-426f-a617-2ac08ef16386-tls-assets\") pod \"de710cda-e9a2-426f-a617-2ac08ef16386\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.376827 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config\") pod \"de710cda-e9a2-426f-a617-2ac08ef16386\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.376863 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-secret-combined-ca-bundle\") pod \"de710cda-e9a2-426f-a617-2ac08ef16386\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.376984 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"de710cda-e9a2-426f-a617-2ac08ef16386\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.377085 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"de710cda-e9a2-426f-a617-2ac08ef16386\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.377229 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de710cda-e9a2-426f-a617-2ac08ef16386-config-out\") pod \"de710cda-e9a2-426f-a617-2ac08ef16386\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.377261 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"de710cda-e9a2-426f-a617-2ac08ef16386\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.377283 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/de710cda-e9a2-426f-a617-2ac08ef16386-prometheus-metric-storage-rulefiles-0\") pod \"de710cda-e9a2-426f-a617-2ac08ef16386\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.377300 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-thanos-prometheus-http-client-file\") pod \"de710cda-e9a2-426f-a617-2ac08ef16386\" (UID: \"de710cda-e9a2-426f-a617-2ac08ef16386\") " Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.379487 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de710cda-e9a2-426f-a617-2ac08ef16386-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "de710cda-e9a2-426f-a617-2ac08ef16386" (UID: "de710cda-e9a2-426f-a617-2ac08ef16386"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.381783 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-config" (OuterVolumeSpecName: "config") pod "de710cda-e9a2-426f-a617-2ac08ef16386" (UID: "de710cda-e9a2-426f-a617-2ac08ef16386"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.381834 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "de710cda-e9a2-426f-a617-2ac08ef16386" (UID: "de710cda-e9a2-426f-a617-2ac08ef16386"). 
InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.383254 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de710cda-e9a2-426f-a617-2ac08ef16386-kube-api-access-f2jh8" (OuterVolumeSpecName: "kube-api-access-f2jh8") pod "de710cda-e9a2-426f-a617-2ac08ef16386" (UID: "de710cda-e9a2-426f-a617-2ac08ef16386"). InnerVolumeSpecName "kube-api-access-f2jh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.383746 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de710cda-e9a2-426f-a617-2ac08ef16386-config-out" (OuterVolumeSpecName: "config-out") pod "de710cda-e9a2-426f-a617-2ac08ef16386" (UID: "de710cda-e9a2-426f-a617-2ac08ef16386"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.390686 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "de710cda-e9a2-426f-a617-2ac08ef16386" (UID: "de710cda-e9a2-426f-a617-2ac08ef16386"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.390757 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "de710cda-e9a2-426f-a617-2ac08ef16386" (UID: "de710cda-e9a2-426f-a617-2ac08ef16386"). 
InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.391171 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "de710cda-e9a2-426f-a617-2ac08ef16386" (UID: "de710cda-e9a2-426f-a617-2ac08ef16386"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.392273 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de710cda-e9a2-426f-a617-2ac08ef16386-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "de710cda-e9a2-426f-a617-2ac08ef16386" (UID: "de710cda-e9a2-426f-a617-2ac08ef16386"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.422474 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "de710cda-e9a2-426f-a617-2ac08ef16386" (UID: "de710cda-e9a2-426f-a617-2ac08ef16386"). InnerVolumeSpecName "pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.464381 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config" (OuterVolumeSpecName: "web-config") pod "de710cda-e9a2-426f-a617-2ac08ef16386" (UID: "de710cda-e9a2-426f-a617-2ac08ef16386"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.480005 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2jh8\" (UniqueName: \"kubernetes.io/projected/de710cda-e9a2-426f-a617-2ac08ef16386-kube-api-access-f2jh8\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.480036 4956 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de710cda-e9a2-426f-a617-2ac08ef16386-tls-assets\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.480046 4956 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.480054 4956 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.480064 4956 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.480090 4956 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") on node \"crc\" " Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.480100 4956 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/de710cda-e9a2-426f-a617-2ac08ef16386-config-out\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.480124 4956 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.480136 4956 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.480145 4956 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/de710cda-e9a2-426f-a617-2ac08ef16386-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.480157 4956 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/de710cda-e9a2-426f-a617-2ac08ef16386-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.505177 4956 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.505915 4956 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423") on node "crc" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.581754 4956 reconciler_common.go:293] "Volume detached for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.768566 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"de710cda-e9a2-426f-a617-2ac08ef16386","Type":"ContainerDied","Data":"0d4bf75f5af38c98c5535b6e021653db444a593f86649a701a155705f2767eba"} Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.768634 4956 scope.go:117] "RemoveContainer" containerID="97f6c0624224989180f4175a911599b14468c1c98c87cd5acfbfb0bfaeb53233" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.769482 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.797317 4956 scope.go:117] "RemoveContainer" containerID="83eb93e1d04ae7cee3cd63476ad6e49e021a57bbe3ff75ebba7a0fddd5c8e5ea" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.811892 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.822268 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.825008 4956 scope.go:117] "RemoveContainer" containerID="19c74458ef26b83bc07dab14c067ef515cc7ce8a2ed066c8d46e4a9a1b05745b" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.863146 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.863430 4956 scope.go:117] "RemoveContainer" containerID="7d209536397595a3a62721c93fdf9641064a7710faa4ab0c47cafd7144855ca9" Sep 30 06:16:00 crc kubenswrapper[4956]: E0930 06:16:00.863666 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="config-reloader" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.863687 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="config-reloader" Sep 30 06:16:00 crc kubenswrapper[4956]: E0930 06:16:00.863724 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="thanos-sidecar" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.863733 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="thanos-sidecar" Sep 30 06:16:00 crc kubenswrapper[4956]: E0930 06:16:00.863756 4956 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="init-config-reloader" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.863767 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="init-config-reloader" Sep 30 06:16:00 crc kubenswrapper[4956]: E0930 06:16:00.863782 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="prometheus" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.863790 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="prometheus" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.864091 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="thanos-sidecar" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.864138 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="prometheus" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.864174 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" containerName="config-reloader" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.879000 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.879126 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.882388 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.882874 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.884575 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-wmk85" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.884580 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.884787 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.898031 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.990296 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.990622 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") 
pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.990668 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.990695 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/949e372a-c9a8-4db6-a275-512d0236dd01-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.990749 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.990779 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 
06:16:00.990935 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-config\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.991088 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/949e372a-c9a8-4db6-a275-512d0236dd01-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.991329 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.991473 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr49w\" (UniqueName: \"kubernetes.io/projected/949e372a-c9a8-4db6-a275-512d0236dd01-kube-api-access-rr49w\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:00 crc kubenswrapper[4956]: I0930 06:16:00.991786 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/949e372a-c9a8-4db6-a275-512d0236dd01-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 
06:16:01.093485 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.093762 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.094385 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-config\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.094521 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/949e372a-c9a8-4db6-a275-512d0236dd01-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.094645 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 
crc kubenswrapper[4956]: I0930 06:16:01.094788 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr49w\" (UniqueName: \"kubernetes.io/projected/949e372a-c9a8-4db6-a275-512d0236dd01-kube-api-access-rr49w\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.094924 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/949e372a-c9a8-4db6-a275-512d0236dd01-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.095065 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.095185 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.095284 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: 
\"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.095370 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/949e372a-c9a8-4db6-a275-512d0236dd01-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.096531 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/949e372a-c9a8-4db6-a275-512d0236dd01-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.099229 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.099844 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.100021 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/949e372a-c9a8-4db6-a275-512d0236dd01-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.100743 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.101213 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.101767 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.108707 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/949e372a-c9a8-4db6-a275-512d0236dd01-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.109981 4956 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.110041 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dd69768781dcae5046075cd89d0649334b1721ac4e99f4c4ea45de29b58bc7b6/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.116469 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr49w\" (UniqueName: \"kubernetes.io/projected/949e372a-c9a8-4db6-a275-512d0236dd01-kube-api-access-rr49w\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.127111 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/949e372a-c9a8-4db6-a275-512d0236dd01-config\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.187197 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41fe8871-e454-4ec5-9b9d-eb85f6351423\") pod \"prometheus-metric-storage-0\" (UID: \"949e372a-c9a8-4db6-a275-512d0236dd01\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.273398 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.742177 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:16:01 crc kubenswrapper[4956]: I0930 06:16:01.778202 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"949e372a-c9a8-4db6-a275-512d0236dd01","Type":"ContainerStarted","Data":"200851821e33bd88ea559c447a7a2d0da3afb81cef0b8b77fed3995b0a805433"} Sep 30 06:16:02 crc kubenswrapper[4956]: I0930 06:16:02.370652 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de710cda-e9a2-426f-a617-2ac08ef16386" path="/var/lib/kubelet/pods/de710cda-e9a2-426f-a617-2ac08ef16386/volumes" Sep 30 06:16:05 crc kubenswrapper[4956]: I0930 06:16:05.823938 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"949e372a-c9a8-4db6-a275-512d0236dd01","Type":"ContainerStarted","Data":"a908d3c1cc168808602dbbe8ffd3a106a7c19b21e6f23e7175bda4d5dc829d01"} Sep 30 06:16:06 crc kubenswrapper[4956]: I0930 06:16:06.023370 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:16:06 crc kubenswrapper[4956]: I0930 06:16:06.023552 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:16:06 crc kubenswrapper[4956]: I0930 06:16:06.081212 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:16:06 crc kubenswrapper[4956]: I0930 06:16:06.906043 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:16:06 crc kubenswrapper[4956]: I0930 06:16:06.969595 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-s8vz6"] Sep 30 06:16:08 crc kubenswrapper[4956]: I0930 06:16:08.862056 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s8vz6" podUID="9159453a-2fc2-4806-b7b3-10a148f2326d" containerName="registry-server" containerID="cri-o://107f01d09bae69691e00c7f25140d7d914ea30cda11e7de85f9572bcc01e78b2" gracePeriod=2 Sep 30 06:16:09 crc kubenswrapper[4956]: I0930 06:16:09.876086 4956 generic.go:334] "Generic (PLEG): container finished" podID="9159453a-2fc2-4806-b7b3-10a148f2326d" containerID="107f01d09bae69691e00c7f25140d7d914ea30cda11e7de85f9572bcc01e78b2" exitCode=0 Sep 30 06:16:09 crc kubenswrapper[4956]: I0930 06:16:09.877221 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8vz6" event={"ID":"9159453a-2fc2-4806-b7b3-10a148f2326d","Type":"ContainerDied","Data":"107f01d09bae69691e00c7f25140d7d914ea30cda11e7de85f9572bcc01e78b2"} Sep 30 06:16:09 crc kubenswrapper[4956]: I0930 06:16:09.959881 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.035102 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9159453a-2fc2-4806-b7b3-10a148f2326d-utilities\") pod \"9159453a-2fc2-4806-b7b3-10a148f2326d\" (UID: \"9159453a-2fc2-4806-b7b3-10a148f2326d\") " Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.035363 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdlnz\" (UniqueName: \"kubernetes.io/projected/9159453a-2fc2-4806-b7b3-10a148f2326d-kube-api-access-rdlnz\") pod \"9159453a-2fc2-4806-b7b3-10a148f2326d\" (UID: \"9159453a-2fc2-4806-b7b3-10a148f2326d\") " Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.035579 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9159453a-2fc2-4806-b7b3-10a148f2326d-catalog-content\") pod \"9159453a-2fc2-4806-b7b3-10a148f2326d\" (UID: \"9159453a-2fc2-4806-b7b3-10a148f2326d\") " Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.036368 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9159453a-2fc2-4806-b7b3-10a148f2326d-utilities" (OuterVolumeSpecName: "utilities") pod "9159453a-2fc2-4806-b7b3-10a148f2326d" (UID: "9159453a-2fc2-4806-b7b3-10a148f2326d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.045286 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9159453a-2fc2-4806-b7b3-10a148f2326d-kube-api-access-rdlnz" (OuterVolumeSpecName: "kube-api-access-rdlnz") pod "9159453a-2fc2-4806-b7b3-10a148f2326d" (UID: "9159453a-2fc2-4806-b7b3-10a148f2326d"). InnerVolumeSpecName "kube-api-access-rdlnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.092935 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9159453a-2fc2-4806-b7b3-10a148f2326d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9159453a-2fc2-4806-b7b3-10a148f2326d" (UID: "9159453a-2fc2-4806-b7b3-10a148f2326d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.136995 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9159453a-2fc2-4806-b7b3-10a148f2326d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.137024 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9159453a-2fc2-4806-b7b3-10a148f2326d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.137036 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdlnz\" (UniqueName: \"kubernetes.io/projected/9159453a-2fc2-4806-b7b3-10a148f2326d-kube-api-access-rdlnz\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.899193 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8vz6" event={"ID":"9159453a-2fc2-4806-b7b3-10a148f2326d","Type":"ContainerDied","Data":"92d70675a53934cf5b3c3e82733ccf2c764b7370426888f409b3042d310f16dc"} Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.899573 4956 scope.go:117] "RemoveContainer" containerID="107f01d09bae69691e00c7f25140d7d914ea30cda11e7de85f9572bcc01e78b2" Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.899294 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s8vz6" Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.942592 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s8vz6"] Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.961747 4956 scope.go:117] "RemoveContainer" containerID="aaccc3921bf5c369a164c93c0fb56a9e3f6f1ed2cbc1da9fd34d143349b587c1" Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.961837 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s8vz6"] Sep 30 06:16:10 crc kubenswrapper[4956]: I0930 06:16:10.990631 4956 scope.go:117] "RemoveContainer" containerID="36318d79015882bda46e4ab030b5664c2a584fadd06c061dc72fcb6fd26c50f6" Sep 30 06:16:12 crc kubenswrapper[4956]: I0930 06:16:12.352795 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9159453a-2fc2-4806-b7b3-10a148f2326d" path="/var/lib/kubelet/pods/9159453a-2fc2-4806-b7b3-10a148f2326d/volumes" Sep 30 06:16:15 crc kubenswrapper[4956]: I0930 06:16:15.968821 4956 generic.go:334] "Generic (PLEG): container finished" podID="949e372a-c9a8-4db6-a275-512d0236dd01" containerID="a908d3c1cc168808602dbbe8ffd3a106a7c19b21e6f23e7175bda4d5dc829d01" exitCode=0 Sep 30 06:16:15 crc kubenswrapper[4956]: I0930 06:16:15.968924 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"949e372a-c9a8-4db6-a275-512d0236dd01","Type":"ContainerDied","Data":"a908d3c1cc168808602dbbe8ffd3a106a7c19b21e6f23e7175bda4d5dc829d01"} Sep 30 06:16:16 crc kubenswrapper[4956]: I0930 06:16:16.982762 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"949e372a-c9a8-4db6-a275-512d0236dd01","Type":"ContainerStarted","Data":"fed81ce10551fbb2c421b9104c8cbfedb642ea43e09bb40871ab9eb0927faea2"} Sep 30 06:16:18 crc kubenswrapper[4956]: I0930 06:16:18.073367 4956 
patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:16:18 crc kubenswrapper[4956]: I0930 06:16:18.073452 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:16:18 crc kubenswrapper[4956]: I0930 06:16:18.073512 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 06:16:18 crc kubenswrapper[4956]: I0930 06:16:18.074617 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:16:18 crc kubenswrapper[4956]: I0930 06:16:18.074719 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" gracePeriod=600 Sep 30 06:16:18 crc kubenswrapper[4956]: E0930 06:16:18.204935 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:16:19 crc kubenswrapper[4956]: I0930 06:16:19.003862 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" exitCode=0 Sep 30 06:16:19 crc kubenswrapper[4956]: I0930 06:16:19.003896 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b"} Sep 30 06:16:19 crc kubenswrapper[4956]: I0930 06:16:19.004238 4956 scope.go:117] "RemoveContainer" containerID="4dc15bfc3f84595554d9630720800f9c8292e405d7c3920824417d9ee8c558cc" Sep 30 06:16:19 crc kubenswrapper[4956]: I0930 06:16:19.004868 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:16:19 crc kubenswrapper[4956]: E0930 06:16:19.005160 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:16:20 crc kubenswrapper[4956]: I0930 06:16:20.016294 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"949e372a-c9a8-4db6-a275-512d0236dd01","Type":"ContainerStarted","Data":"253a4394ef5d7131177047f3138cb5b38468b257eb978a5be4dd29355ae1858f"} Sep 30 06:16:20 crc 
kubenswrapper[4956]: I0930 06:16:20.016537 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"949e372a-c9a8-4db6-a275-512d0236dd01","Type":"ContainerStarted","Data":"b3f4593aa02ebc067916a48a78f0fd40d8db2562567e9f834de3764d9007d038"} Sep 30 06:16:20 crc kubenswrapper[4956]: I0930 06:16:20.042964 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=20.042947086 podStartE2EDuration="20.042947086s" podCreationTimestamp="2025-09-30 06:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:16:20.03703465 +0000 UTC m=+2850.364155175" watchObservedRunningTime="2025-09-30 06:16:20.042947086 +0000 UTC m=+2850.370067611" Sep 30 06:16:21 crc kubenswrapper[4956]: I0930 06:16:21.274686 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.314625 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7crww"] Sep 30 06:16:23 crc kubenswrapper[4956]: E0930 06:16:23.316461 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9159453a-2fc2-4806-b7b3-10a148f2326d" containerName="extract-utilities" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.316801 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="9159453a-2fc2-4806-b7b3-10a148f2326d" containerName="extract-utilities" Sep 30 06:16:23 crc kubenswrapper[4956]: E0930 06:16:23.316959 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9159453a-2fc2-4806-b7b3-10a148f2326d" containerName="registry-server" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.317064 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="9159453a-2fc2-4806-b7b3-10a148f2326d" 
containerName="registry-server" Sep 30 06:16:23 crc kubenswrapper[4956]: E0930 06:16:23.317229 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9159453a-2fc2-4806-b7b3-10a148f2326d" containerName="extract-content" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.317348 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="9159453a-2fc2-4806-b7b3-10a148f2326d" containerName="extract-content" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.317808 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="9159453a-2fc2-4806-b7b3-10a148f2326d" containerName="registry-server" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.320383 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.371950 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7crww"] Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.409578 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb0a0bc-6048-4891-8ba5-330a4cb35811-catalog-content\") pod \"community-operators-7crww\" (UID: \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\") " pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.409667 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph249\" (UniqueName: \"kubernetes.io/projected/8eb0a0bc-6048-4891-8ba5-330a4cb35811-kube-api-access-ph249\") pod \"community-operators-7crww\" (UID: \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\") " pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.409807 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb0a0bc-6048-4891-8ba5-330a4cb35811-utilities\") pod \"community-operators-7crww\" (UID: \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\") " pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.511051 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb0a0bc-6048-4891-8ba5-330a4cb35811-catalog-content\") pod \"community-operators-7crww\" (UID: \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\") " pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.511106 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph249\" (UniqueName: \"kubernetes.io/projected/8eb0a0bc-6048-4891-8ba5-330a4cb35811-kube-api-access-ph249\") pod \"community-operators-7crww\" (UID: \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\") " pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.511216 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb0a0bc-6048-4891-8ba5-330a4cb35811-utilities\") pod \"community-operators-7crww\" (UID: \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\") " pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.511636 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb0a0bc-6048-4891-8ba5-330a4cb35811-utilities\") pod \"community-operators-7crww\" (UID: \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\") " pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.511840 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8eb0a0bc-6048-4891-8ba5-330a4cb35811-catalog-content\") pod \"community-operators-7crww\" (UID: \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\") " pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.542035 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph249\" (UniqueName: \"kubernetes.io/projected/8eb0a0bc-6048-4891-8ba5-330a4cb35811-kube-api-access-ph249\") pod \"community-operators-7crww\" (UID: \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\") " pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:23 crc kubenswrapper[4956]: I0930 06:16:23.701319 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:24 crc kubenswrapper[4956]: I0930 06:16:24.215422 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7crww"] Sep 30 06:16:25 crc kubenswrapper[4956]: I0930 06:16:25.082592 4956 generic.go:334] "Generic (PLEG): container finished" podID="8eb0a0bc-6048-4891-8ba5-330a4cb35811" containerID="58fe7184d3fa7ea525153494b69e47618ca8b6e509c3030e474166600a77d7af" exitCode=0 Sep 30 06:16:25 crc kubenswrapper[4956]: I0930 06:16:25.082801 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7crww" event={"ID":"8eb0a0bc-6048-4891-8ba5-330a4cb35811","Type":"ContainerDied","Data":"58fe7184d3fa7ea525153494b69e47618ca8b6e509c3030e474166600a77d7af"} Sep 30 06:16:25 crc kubenswrapper[4956]: I0930 06:16:25.083517 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7crww" event={"ID":"8eb0a0bc-6048-4891-8ba5-330a4cb35811","Type":"ContainerStarted","Data":"09fb70a00a3b871c0505f060acdd0c9573b35c2f19691b7bab9fd3f819d2696d"} Sep 30 06:16:26 crc kubenswrapper[4956]: I0930 06:16:26.097004 4956 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-7crww" event={"ID":"8eb0a0bc-6048-4891-8ba5-330a4cb35811","Type":"ContainerStarted","Data":"89d166ec4c6efb2687512893be73071a5ea33480268856303bf000ba6b7c1793"} Sep 30 06:16:27 crc kubenswrapper[4956]: I0930 06:16:27.106551 4956 generic.go:334] "Generic (PLEG): container finished" podID="8eb0a0bc-6048-4891-8ba5-330a4cb35811" containerID="89d166ec4c6efb2687512893be73071a5ea33480268856303bf000ba6b7c1793" exitCode=0 Sep 30 06:16:27 crc kubenswrapper[4956]: I0930 06:16:27.106803 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7crww" event={"ID":"8eb0a0bc-6048-4891-8ba5-330a4cb35811","Type":"ContainerDied","Data":"89d166ec4c6efb2687512893be73071a5ea33480268856303bf000ba6b7c1793"} Sep 30 06:16:28 crc kubenswrapper[4956]: I0930 06:16:28.115867 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7crww" event={"ID":"8eb0a0bc-6048-4891-8ba5-330a4cb35811","Type":"ContainerStarted","Data":"c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37"} Sep 30 06:16:28 crc kubenswrapper[4956]: I0930 06:16:28.142623 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7crww" podStartSLOduration=2.4723302 podStartE2EDuration="5.142605387s" podCreationTimestamp="2025-09-30 06:16:23 +0000 UTC" firstStartedPulling="2025-09-30 06:16:25.08619207 +0000 UTC m=+2855.413312605" lastFinishedPulling="2025-09-30 06:16:27.756467217 +0000 UTC m=+2858.083587792" observedRunningTime="2025-09-30 06:16:28.132205201 +0000 UTC m=+2858.459325746" watchObservedRunningTime="2025-09-30 06:16:28.142605387 +0000 UTC m=+2858.469725912" Sep 30 06:16:30 crc kubenswrapper[4956]: I0930 06:16:30.355463 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:16:30 crc kubenswrapper[4956]: E0930 06:16:30.358646 4956 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:16:31 crc kubenswrapper[4956]: I0930 06:16:31.274418 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:31 crc kubenswrapper[4956]: I0930 06:16:31.280549 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:32 crc kubenswrapper[4956]: I0930 06:16:32.152307 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 06:16:33 crc kubenswrapper[4956]: I0930 06:16:33.701808 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:33 crc kubenswrapper[4956]: I0930 06:16:33.702139 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:33 crc kubenswrapper[4956]: I0930 06:16:33.765650 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:34 crc kubenswrapper[4956]: I0930 06:16:34.227270 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:35 crc kubenswrapper[4956]: I0930 06:16:35.287738 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7crww"] Sep 30 06:16:36 crc kubenswrapper[4956]: I0930 06:16:36.186973 4956 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7crww" podUID="8eb0a0bc-6048-4891-8ba5-330a4cb35811" containerName="registry-server" containerID="cri-o://c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37" gracePeriod=2 Sep 30 06:16:36 crc kubenswrapper[4956]: I0930 06:16:36.702704 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:36 crc kubenswrapper[4956]: I0930 06:16:36.810839 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb0a0bc-6048-4891-8ba5-330a4cb35811-utilities\") pod \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\" (UID: \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\") " Sep 30 06:16:36 crc kubenswrapper[4956]: I0930 06:16:36.810917 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb0a0bc-6048-4891-8ba5-330a4cb35811-catalog-content\") pod \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\" (UID: \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\") " Sep 30 06:16:36 crc kubenswrapper[4956]: I0930 06:16:36.811178 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph249\" (UniqueName: \"kubernetes.io/projected/8eb0a0bc-6048-4891-8ba5-330a4cb35811-kube-api-access-ph249\") pod \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\" (UID: \"8eb0a0bc-6048-4891-8ba5-330a4cb35811\") " Sep 30 06:16:36 crc kubenswrapper[4956]: I0930 06:16:36.811762 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eb0a0bc-6048-4891-8ba5-330a4cb35811-utilities" (OuterVolumeSpecName: "utilities") pod "8eb0a0bc-6048-4891-8ba5-330a4cb35811" (UID: "8eb0a0bc-6048-4891-8ba5-330a4cb35811"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:16:36 crc kubenswrapper[4956]: I0930 06:16:36.811890 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eb0a0bc-6048-4891-8ba5-330a4cb35811-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:36 crc kubenswrapper[4956]: I0930 06:16:36.816400 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eb0a0bc-6048-4891-8ba5-330a4cb35811-kube-api-access-ph249" (OuterVolumeSpecName: "kube-api-access-ph249") pod "8eb0a0bc-6048-4891-8ba5-330a4cb35811" (UID: "8eb0a0bc-6048-4891-8ba5-330a4cb35811"). InnerVolumeSpecName "kube-api-access-ph249". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:16:36 crc kubenswrapper[4956]: I0930 06:16:36.857500 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eb0a0bc-6048-4891-8ba5-330a4cb35811-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8eb0a0bc-6048-4891-8ba5-330a4cb35811" (UID: "8eb0a0bc-6048-4891-8ba5-330a4cb35811"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:16:36 crc kubenswrapper[4956]: I0930 06:16:36.914045 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph249\" (UniqueName: \"kubernetes.io/projected/8eb0a0bc-6048-4891-8ba5-330a4cb35811-kube-api-access-ph249\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:36 crc kubenswrapper[4956]: I0930 06:16:36.914083 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eb0a0bc-6048-4891-8ba5-330a4cb35811-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.199582 4956 generic.go:334] "Generic (PLEG): container finished" podID="8eb0a0bc-6048-4891-8ba5-330a4cb35811" containerID="c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37" exitCode=0 Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.199642 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7crww" event={"ID":"8eb0a0bc-6048-4891-8ba5-330a4cb35811","Type":"ContainerDied","Data":"c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37"} Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.199692 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7crww" event={"ID":"8eb0a0bc-6048-4891-8ba5-330a4cb35811","Type":"ContainerDied","Data":"09fb70a00a3b871c0505f060acdd0c9573b35c2f19691b7bab9fd3f819d2696d"} Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.199710 4956 scope.go:117] "RemoveContainer" containerID="c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37" Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.199713 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7crww" Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.226441 4956 scope.go:117] "RemoveContainer" containerID="89d166ec4c6efb2687512893be73071a5ea33480268856303bf000ba6b7c1793" Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.253747 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7crww"] Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.266433 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7crww"] Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.283821 4956 scope.go:117] "RemoveContainer" containerID="58fe7184d3fa7ea525153494b69e47618ca8b6e509c3030e474166600a77d7af" Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.312336 4956 scope.go:117] "RemoveContainer" containerID="c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37" Sep 30 06:16:37 crc kubenswrapper[4956]: E0930 06:16:37.312771 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37\": container with ID starting with c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37 not found: ID does not exist" containerID="c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37" Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.312809 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37"} err="failed to get container status \"c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37\": rpc error: code = NotFound desc = could not find container \"c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37\": container with ID starting with c0f4b5164085f2b298b9bc1ddc66219b0572b0aa5a864ba71eaec6bb4c785e37 not 
found: ID does not exist" Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.312834 4956 scope.go:117] "RemoveContainer" containerID="89d166ec4c6efb2687512893be73071a5ea33480268856303bf000ba6b7c1793" Sep 30 06:16:37 crc kubenswrapper[4956]: E0930 06:16:37.313081 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89d166ec4c6efb2687512893be73071a5ea33480268856303bf000ba6b7c1793\": container with ID starting with 89d166ec4c6efb2687512893be73071a5ea33480268856303bf000ba6b7c1793 not found: ID does not exist" containerID="89d166ec4c6efb2687512893be73071a5ea33480268856303bf000ba6b7c1793" Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.313111 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d166ec4c6efb2687512893be73071a5ea33480268856303bf000ba6b7c1793"} err="failed to get container status \"89d166ec4c6efb2687512893be73071a5ea33480268856303bf000ba6b7c1793\": rpc error: code = NotFound desc = could not find container \"89d166ec4c6efb2687512893be73071a5ea33480268856303bf000ba6b7c1793\": container with ID starting with 89d166ec4c6efb2687512893be73071a5ea33480268856303bf000ba6b7c1793 not found: ID does not exist" Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.313142 4956 scope.go:117] "RemoveContainer" containerID="58fe7184d3fa7ea525153494b69e47618ca8b6e509c3030e474166600a77d7af" Sep 30 06:16:37 crc kubenswrapper[4956]: E0930 06:16:37.313595 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58fe7184d3fa7ea525153494b69e47618ca8b6e509c3030e474166600a77d7af\": container with ID starting with 58fe7184d3fa7ea525153494b69e47618ca8b6e509c3030e474166600a77d7af not found: ID does not exist" containerID="58fe7184d3fa7ea525153494b69e47618ca8b6e509c3030e474166600a77d7af" Sep 30 06:16:37 crc kubenswrapper[4956]: I0930 06:16:37.313688 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58fe7184d3fa7ea525153494b69e47618ca8b6e509c3030e474166600a77d7af"} err="failed to get container status \"58fe7184d3fa7ea525153494b69e47618ca8b6e509c3030e474166600a77d7af\": rpc error: code = NotFound desc = could not find container \"58fe7184d3fa7ea525153494b69e47618ca8b6e509c3030e474166600a77d7af\": container with ID starting with 58fe7184d3fa7ea525153494b69e47618ca8b6e509c3030e474166600a77d7af not found: ID does not exist" Sep 30 06:16:38 crc kubenswrapper[4956]: I0930 06:16:38.386625 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eb0a0bc-6048-4891-8ba5-330a4cb35811" path="/var/lib/kubelet/pods/8eb0a0bc-6048-4891-8ba5-330a4cb35811/volumes" Sep 30 06:16:42 crc kubenswrapper[4956]: I0930 06:16:42.342250 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:16:42 crc kubenswrapper[4956]: E0930 06:16:42.343296 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.269594 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 06:16:53 crc kubenswrapper[4956]: E0930 06:16:53.271005 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb0a0bc-6048-4891-8ba5-330a4cb35811" containerName="extract-utilities" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.271038 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb0a0bc-6048-4891-8ba5-330a4cb35811" 
containerName="extract-utilities" Sep 30 06:16:53 crc kubenswrapper[4956]: E0930 06:16:53.271072 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb0a0bc-6048-4891-8ba5-330a4cb35811" containerName="registry-server" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.271088 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb0a0bc-6048-4891-8ba5-330a4cb35811" containerName="registry-server" Sep 30 06:16:53 crc kubenswrapper[4956]: E0930 06:16:53.271161 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb0a0bc-6048-4891-8ba5-330a4cb35811" containerName="extract-content" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.271177 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb0a0bc-6048-4891-8ba5-330a4cb35811" containerName="extract-content" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.271573 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eb0a0bc-6048-4891-8ba5-330a4cb35811" containerName="registry-server" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.272756 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.275719 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.280282 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kdgrq" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.280378 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.280470 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.288862 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.385594 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.385667 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.385721 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-openstack-config\") 
pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.385748 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.385764 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.385783 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.385863 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-config-data\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.385921 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pltrb\" (UniqueName: \"kubernetes.io/projected/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-kube-api-access-pltrb\") pod 
\"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.387262 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.490284 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.490412 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.490513 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.490568 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-openstack-config\") pod \"tempest-tests-tempest\" (UID: 
\"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.490623 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.490655 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.490694 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.490959 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-config-data\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.491082 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pltrb\" (UniqueName: \"kubernetes.io/projected/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-kube-api-access-pltrb\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc 
kubenswrapper[4956]: I0930 06:16:53.491276 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.491339 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.491648 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.492098 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.493216 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-config-data\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.498644 4956 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.498902 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.507032 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.511001 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pltrb\" (UniqueName: \"kubernetes.io/projected/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-kube-api-access-pltrb\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.528721 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " pod="openstack/tempest-tests-tempest" Sep 30 06:16:53 crc kubenswrapper[4956]: I0930 06:16:53.638635 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 06:16:54 crc kubenswrapper[4956]: I0930 06:16:54.107192 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 06:16:54 crc kubenswrapper[4956]: W0930 06:16:54.115615 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc822eb6a_ddf6_44d5_8a3c_35408a3a0f69.slice/crio-220592b4358ee1a1456cf05ae73ff297df2a78f03c6c5a4bb8d0d8b909119ce6 WatchSource:0}: Error finding container 220592b4358ee1a1456cf05ae73ff297df2a78f03c6c5a4bb8d0d8b909119ce6: Status 404 returned error can't find the container with id 220592b4358ee1a1456cf05ae73ff297df2a78f03c6c5a4bb8d0d8b909119ce6 Sep 30 06:16:54 crc kubenswrapper[4956]: I0930 06:16:54.424791 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69","Type":"ContainerStarted","Data":"220592b4358ee1a1456cf05ae73ff297df2a78f03c6c5a4bb8d0d8b909119ce6"} Sep 30 06:16:55 crc kubenswrapper[4956]: I0930 06:16:55.340595 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:16:55 crc kubenswrapper[4956]: E0930 06:16:55.341086 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:17:08 crc kubenswrapper[4956]: I0930 06:17:08.343196 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:17:08 crc kubenswrapper[4956]: E0930 06:17:08.344063 4956 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:17:10 crc kubenswrapper[4956]: I0930 06:17:10.638072 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69","Type":"ContainerStarted","Data":"7416284bc5173a03ea9d1994287c83158f503af3d81d45dc8efb894751fe25dc"} Sep 30 06:17:10 crc kubenswrapper[4956]: I0930 06:17:10.660888 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.42937278 podStartE2EDuration="18.6608624s" podCreationTimestamp="2025-09-30 06:16:52 +0000 UTC" firstStartedPulling="2025-09-30 06:16:54.118097623 +0000 UTC m=+2884.445218148" lastFinishedPulling="2025-09-30 06:17:09.349587233 +0000 UTC m=+2899.676707768" observedRunningTime="2025-09-30 06:17:10.656288266 +0000 UTC m=+2900.983408791" watchObservedRunningTime="2025-09-30 06:17:10.6608624 +0000 UTC m=+2900.987982945" Sep 30 06:17:21 crc kubenswrapper[4956]: I0930 06:17:21.341826 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:17:21 crc kubenswrapper[4956]: E0930 06:17:21.342618 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" 
podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:17:35 crc kubenswrapper[4956]: I0930 06:17:35.341590 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:17:35 crc kubenswrapper[4956]: E0930 06:17:35.342834 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:17:49 crc kubenswrapper[4956]: I0930 06:17:49.340924 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:17:49 crc kubenswrapper[4956]: E0930 06:17:49.341781 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:18:00 crc kubenswrapper[4956]: I0930 06:18:00.347682 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:18:00 crc kubenswrapper[4956]: E0930 06:18:00.348534 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:18:12 crc kubenswrapper[4956]: I0930 06:18:12.340876 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:18:12 crc kubenswrapper[4956]: E0930 06:18:12.341555 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.003860 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jtpdx"] Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.007969 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.015764 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtpdx"] Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.149541 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h66ts\" (UniqueName: \"kubernetes.io/projected/641d195b-fbfe-4c71-953a-ce5a7c6891f1-kube-api-access-h66ts\") pod \"redhat-marketplace-jtpdx\" (UID: \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\") " pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.149883 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/641d195b-fbfe-4c71-953a-ce5a7c6891f1-utilities\") pod \"redhat-marketplace-jtpdx\" (UID: \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\") " pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.150048 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/641d195b-fbfe-4c71-953a-ce5a7c6891f1-catalog-content\") pod \"redhat-marketplace-jtpdx\" (UID: \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\") " pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.251896 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/641d195b-fbfe-4c71-953a-ce5a7c6891f1-utilities\") pod \"redhat-marketplace-jtpdx\" (UID: \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\") " pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.251991 4956 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/641d195b-fbfe-4c71-953a-ce5a7c6891f1-catalog-content\") pod \"redhat-marketplace-jtpdx\" (UID: \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\") " pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.252041 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h66ts\" (UniqueName: \"kubernetes.io/projected/641d195b-fbfe-4c71-953a-ce5a7c6891f1-kube-api-access-h66ts\") pod \"redhat-marketplace-jtpdx\" (UID: \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\") " pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.252612 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/641d195b-fbfe-4c71-953a-ce5a7c6891f1-catalog-content\") pod \"redhat-marketplace-jtpdx\" (UID: \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\") " pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.252623 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/641d195b-fbfe-4c71-953a-ce5a7c6891f1-utilities\") pod \"redhat-marketplace-jtpdx\" (UID: \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\") " pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.282288 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h66ts\" (UniqueName: \"kubernetes.io/projected/641d195b-fbfe-4c71-953a-ce5a7c6891f1-kube-api-access-h66ts\") pod \"redhat-marketplace-jtpdx\" (UID: \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\") " pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.337646 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:18 crc kubenswrapper[4956]: I0930 06:18:18.793844 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtpdx"] Sep 30 06:18:19 crc kubenswrapper[4956]: I0930 06:18:19.377877 4956 generic.go:334] "Generic (PLEG): container finished" podID="641d195b-fbfe-4c71-953a-ce5a7c6891f1" containerID="4b0ad566d913f753af134243786c2a374cd81af7c7f55fdc43f65a9bc29759b7" exitCode=0 Sep 30 06:18:19 crc kubenswrapper[4956]: I0930 06:18:19.378210 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtpdx" event={"ID":"641d195b-fbfe-4c71-953a-ce5a7c6891f1","Type":"ContainerDied","Data":"4b0ad566d913f753af134243786c2a374cd81af7c7f55fdc43f65a9bc29759b7"} Sep 30 06:18:19 crc kubenswrapper[4956]: I0930 06:18:19.378237 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtpdx" event={"ID":"641d195b-fbfe-4c71-953a-ce5a7c6891f1","Type":"ContainerStarted","Data":"f3e1af7791cdff2b38a52c5ba2fef09747e938be85eec0cf67b93ce989e9341d"} Sep 30 06:18:21 crc kubenswrapper[4956]: I0930 06:18:21.401071 4956 generic.go:334] "Generic (PLEG): container finished" podID="641d195b-fbfe-4c71-953a-ce5a7c6891f1" containerID="7047ed730efaf9e36a1036dc5e83536af9707fe2c470ccc73574ef322f0ea2e7" exitCode=0 Sep 30 06:18:21 crc kubenswrapper[4956]: I0930 06:18:21.401207 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtpdx" event={"ID":"641d195b-fbfe-4c71-953a-ce5a7c6891f1","Type":"ContainerDied","Data":"7047ed730efaf9e36a1036dc5e83536af9707fe2c470ccc73574ef322f0ea2e7"} Sep 30 06:18:22 crc kubenswrapper[4956]: I0930 06:18:22.418035 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtpdx" 
event={"ID":"641d195b-fbfe-4c71-953a-ce5a7c6891f1","Type":"ContainerStarted","Data":"a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae"} Sep 30 06:18:22 crc kubenswrapper[4956]: I0930 06:18:22.444545 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jtpdx" podStartSLOduration=2.9732673099999998 podStartE2EDuration="5.444518135s" podCreationTimestamp="2025-09-30 06:18:17 +0000 UTC" firstStartedPulling="2025-09-30 06:18:19.380629454 +0000 UTC m=+2969.707749979" lastFinishedPulling="2025-09-30 06:18:21.851880279 +0000 UTC m=+2972.179000804" observedRunningTime="2025-09-30 06:18:22.435325457 +0000 UTC m=+2972.762446022" watchObservedRunningTime="2025-09-30 06:18:22.444518135 +0000 UTC m=+2972.771638710" Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.204846 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-55xft"] Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.209016 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.217103 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bklvw\" (UniqueName: \"kubernetes.io/projected/486721af-3893-4de3-8a42-efdcbcc78dfe-kube-api-access-bklvw\") pod \"redhat-operators-55xft\" (UID: \"486721af-3893-4de3-8a42-efdcbcc78dfe\") " pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.217380 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/486721af-3893-4de3-8a42-efdcbcc78dfe-catalog-content\") pod \"redhat-operators-55xft\" (UID: \"486721af-3893-4de3-8a42-efdcbcc78dfe\") " pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.217527 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/486721af-3893-4de3-8a42-efdcbcc78dfe-utilities\") pod \"redhat-operators-55xft\" (UID: \"486721af-3893-4de3-8a42-efdcbcc78dfe\") " pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.228100 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-55xft"] Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.319444 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/486721af-3893-4de3-8a42-efdcbcc78dfe-catalog-content\") pod \"redhat-operators-55xft\" (UID: \"486721af-3893-4de3-8a42-efdcbcc78dfe\") " pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.319890 4956 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/486721af-3893-4de3-8a42-efdcbcc78dfe-utilities\") pod \"redhat-operators-55xft\" (UID: \"486721af-3893-4de3-8a42-efdcbcc78dfe\") " pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.320034 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bklvw\" (UniqueName: \"kubernetes.io/projected/486721af-3893-4de3-8a42-efdcbcc78dfe-kube-api-access-bklvw\") pod \"redhat-operators-55xft\" (UID: \"486721af-3893-4de3-8a42-efdcbcc78dfe\") " pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.320264 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/486721af-3893-4de3-8a42-efdcbcc78dfe-catalog-content\") pod \"redhat-operators-55xft\" (UID: \"486721af-3893-4de3-8a42-efdcbcc78dfe\") " pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.320512 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/486721af-3893-4de3-8a42-efdcbcc78dfe-utilities\") pod \"redhat-operators-55xft\" (UID: \"486721af-3893-4de3-8a42-efdcbcc78dfe\") " pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.349171 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bklvw\" (UniqueName: \"kubernetes.io/projected/486721af-3893-4de3-8a42-efdcbcc78dfe-kube-api-access-bklvw\") pod \"redhat-operators-55xft\" (UID: \"486721af-3893-4de3-8a42-efdcbcc78dfe\") " pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:25 crc kubenswrapper[4956]: I0930 06:18:25.556931 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:26 crc kubenswrapper[4956]: I0930 06:18:26.084014 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-55xft"] Sep 30 06:18:26 crc kubenswrapper[4956]: I0930 06:18:26.341700 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:18:26 crc kubenswrapper[4956]: E0930 06:18:26.342236 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:18:26 crc kubenswrapper[4956]: I0930 06:18:26.456927 4956 generic.go:334] "Generic (PLEG): container finished" podID="486721af-3893-4de3-8a42-efdcbcc78dfe" containerID="af0a720f6c798897e96642a387a0087167b1a1e53d4f6abeeab4317a91cd1b82" exitCode=0 Sep 30 06:18:26 crc kubenswrapper[4956]: I0930 06:18:26.456980 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55xft" event={"ID":"486721af-3893-4de3-8a42-efdcbcc78dfe","Type":"ContainerDied","Data":"af0a720f6c798897e96642a387a0087167b1a1e53d4f6abeeab4317a91cd1b82"} Sep 30 06:18:26 crc kubenswrapper[4956]: I0930 06:18:26.457009 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55xft" event={"ID":"486721af-3893-4de3-8a42-efdcbcc78dfe","Type":"ContainerStarted","Data":"501a0c51ee4d7a13c4fdb7de3205f3d563b155a065777f4ab159fdb09743efc1"} Sep 30 06:18:27 crc kubenswrapper[4956]: I0930 06:18:27.467102 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55xft" 
event={"ID":"486721af-3893-4de3-8a42-efdcbcc78dfe","Type":"ContainerStarted","Data":"06ec3a7ae919afd0c5cd00a33f4944da86197a292937c8629c498effe3c00012"} Sep 30 06:18:28 crc kubenswrapper[4956]: I0930 06:18:28.337802 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:28 crc kubenswrapper[4956]: I0930 06:18:28.337909 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:28 crc kubenswrapper[4956]: I0930 06:18:28.393989 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:28 crc kubenswrapper[4956]: I0930 06:18:28.518859 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:29 crc kubenswrapper[4956]: I0930 06:18:29.490303 4956 generic.go:334] "Generic (PLEG): container finished" podID="486721af-3893-4de3-8a42-efdcbcc78dfe" containerID="06ec3a7ae919afd0c5cd00a33f4944da86197a292937c8629c498effe3c00012" exitCode=0 Sep 30 06:18:29 crc kubenswrapper[4956]: I0930 06:18:29.490404 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55xft" event={"ID":"486721af-3893-4de3-8a42-efdcbcc78dfe","Type":"ContainerDied","Data":"06ec3a7ae919afd0c5cd00a33f4944da86197a292937c8629c498effe3c00012"} Sep 30 06:18:30 crc kubenswrapper[4956]: I0930 06:18:30.785474 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtpdx"] Sep 30 06:18:31 crc kubenswrapper[4956]: I0930 06:18:31.514509 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55xft" event={"ID":"486721af-3893-4de3-8a42-efdcbcc78dfe","Type":"ContainerStarted","Data":"b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d"} Sep 30 06:18:31 crc 
kubenswrapper[4956]: I0930 06:18:31.514673 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jtpdx" podUID="641d195b-fbfe-4c71-953a-ce5a7c6891f1" containerName="registry-server" containerID="cri-o://a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae" gracePeriod=2 Sep 30 06:18:31 crc kubenswrapper[4956]: I0930 06:18:31.538394 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-55xft" podStartSLOduration=2.206621568 podStartE2EDuration="6.538373627s" podCreationTimestamp="2025-09-30 06:18:25 +0000 UTC" firstStartedPulling="2025-09-30 06:18:26.459308299 +0000 UTC m=+2976.786428824" lastFinishedPulling="2025-09-30 06:18:30.791060318 +0000 UTC m=+2981.118180883" observedRunningTime="2025-09-30 06:18:31.534100732 +0000 UTC m=+2981.861221267" watchObservedRunningTime="2025-09-30 06:18:31.538373627 +0000 UTC m=+2981.865494152" Sep 30 06:18:31 crc kubenswrapper[4956]: I0930 06:18:31.949810 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.051462 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h66ts\" (UniqueName: \"kubernetes.io/projected/641d195b-fbfe-4c71-953a-ce5a7c6891f1-kube-api-access-h66ts\") pod \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\" (UID: \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\") " Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.051649 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/641d195b-fbfe-4c71-953a-ce5a7c6891f1-catalog-content\") pod \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\" (UID: \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\") " Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.051713 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/641d195b-fbfe-4c71-953a-ce5a7c6891f1-utilities\") pod \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\" (UID: \"641d195b-fbfe-4c71-953a-ce5a7c6891f1\") " Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.052840 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/641d195b-fbfe-4c71-953a-ce5a7c6891f1-utilities" (OuterVolumeSpecName: "utilities") pod "641d195b-fbfe-4c71-953a-ce5a7c6891f1" (UID: "641d195b-fbfe-4c71-953a-ce5a7c6891f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.064902 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/641d195b-fbfe-4c71-953a-ce5a7c6891f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "641d195b-fbfe-4c71-953a-ce5a7c6891f1" (UID: "641d195b-fbfe-4c71-953a-ce5a7c6891f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.065457 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/641d195b-fbfe-4c71-953a-ce5a7c6891f1-kube-api-access-h66ts" (OuterVolumeSpecName: "kube-api-access-h66ts") pod "641d195b-fbfe-4c71-953a-ce5a7c6891f1" (UID: "641d195b-fbfe-4c71-953a-ce5a7c6891f1"). InnerVolumeSpecName "kube-api-access-h66ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.154847 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/641d195b-fbfe-4c71-953a-ce5a7c6891f1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.154897 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/641d195b-fbfe-4c71-953a-ce5a7c6891f1-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.154918 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h66ts\" (UniqueName: \"kubernetes.io/projected/641d195b-fbfe-4c71-953a-ce5a7c6891f1-kube-api-access-h66ts\") on node \"crc\" DevicePath \"\"" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.527764 4956 generic.go:334] "Generic (PLEG): container finished" podID="641d195b-fbfe-4c71-953a-ce5a7c6891f1" containerID="a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae" exitCode=0 Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.527810 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtpdx" event={"ID":"641d195b-fbfe-4c71-953a-ce5a7c6891f1","Type":"ContainerDied","Data":"a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae"} Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.528006 4956 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jtpdx" event={"ID":"641d195b-fbfe-4c71-953a-ce5a7c6891f1","Type":"ContainerDied","Data":"f3e1af7791cdff2b38a52c5ba2fef09747e938be85eec0cf67b93ce989e9341d"} Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.528028 4956 scope.go:117] "RemoveContainer" containerID="a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.527865 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jtpdx" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.549777 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtpdx"] Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.551953 4956 scope.go:117] "RemoveContainer" containerID="7047ed730efaf9e36a1036dc5e83536af9707fe2c470ccc73574ef322f0ea2e7" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.557241 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jtpdx"] Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.577230 4956 scope.go:117] "RemoveContainer" containerID="4b0ad566d913f753af134243786c2a374cd81af7c7f55fdc43f65a9bc29759b7" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.631214 4956 scope.go:117] "RemoveContainer" containerID="a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae" Sep 30 06:18:32 crc kubenswrapper[4956]: E0930 06:18:32.631719 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae\": container with ID starting with a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae not found: ID does not exist" containerID="a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 
06:18:32.631755 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae"} err="failed to get container status \"a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae\": rpc error: code = NotFound desc = could not find container \"a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae\": container with ID starting with a39a79b838168c9f7981c13ab768ede39fbf1297415ce0ec2353e919b6431dae not found: ID does not exist" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.631774 4956 scope.go:117] "RemoveContainer" containerID="7047ed730efaf9e36a1036dc5e83536af9707fe2c470ccc73574ef322f0ea2e7" Sep 30 06:18:32 crc kubenswrapper[4956]: E0930 06:18:32.632096 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7047ed730efaf9e36a1036dc5e83536af9707fe2c470ccc73574ef322f0ea2e7\": container with ID starting with 7047ed730efaf9e36a1036dc5e83536af9707fe2c470ccc73574ef322f0ea2e7 not found: ID does not exist" containerID="7047ed730efaf9e36a1036dc5e83536af9707fe2c470ccc73574ef322f0ea2e7" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.632144 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7047ed730efaf9e36a1036dc5e83536af9707fe2c470ccc73574ef322f0ea2e7"} err="failed to get container status \"7047ed730efaf9e36a1036dc5e83536af9707fe2c470ccc73574ef322f0ea2e7\": rpc error: code = NotFound desc = could not find container \"7047ed730efaf9e36a1036dc5e83536af9707fe2c470ccc73574ef322f0ea2e7\": container with ID starting with 7047ed730efaf9e36a1036dc5e83536af9707fe2c470ccc73574ef322f0ea2e7 not found: ID does not exist" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.632157 4956 scope.go:117] "RemoveContainer" containerID="4b0ad566d913f753af134243786c2a374cd81af7c7f55fdc43f65a9bc29759b7" Sep 30 06:18:32 crc 
kubenswrapper[4956]: E0930 06:18:32.632450 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b0ad566d913f753af134243786c2a374cd81af7c7f55fdc43f65a9bc29759b7\": container with ID starting with 4b0ad566d913f753af134243786c2a374cd81af7c7f55fdc43f65a9bc29759b7 not found: ID does not exist" containerID="4b0ad566d913f753af134243786c2a374cd81af7c7f55fdc43f65a9bc29759b7" Sep 30 06:18:32 crc kubenswrapper[4956]: I0930 06:18:32.632474 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b0ad566d913f753af134243786c2a374cd81af7c7f55fdc43f65a9bc29759b7"} err="failed to get container status \"4b0ad566d913f753af134243786c2a374cd81af7c7f55fdc43f65a9bc29759b7\": rpc error: code = NotFound desc = could not find container \"4b0ad566d913f753af134243786c2a374cd81af7c7f55fdc43f65a9bc29759b7\": container with ID starting with 4b0ad566d913f753af134243786c2a374cd81af7c7f55fdc43f65a9bc29759b7 not found: ID does not exist" Sep 30 06:18:34 crc kubenswrapper[4956]: I0930 06:18:34.353006 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="641d195b-fbfe-4c71-953a-ce5a7c6891f1" path="/var/lib/kubelet/pods/641d195b-fbfe-4c71-953a-ce5a7c6891f1/volumes" Sep 30 06:18:35 crc kubenswrapper[4956]: I0930 06:18:35.557351 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:35 crc kubenswrapper[4956]: I0930 06:18:35.557425 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:36 crc kubenswrapper[4956]: I0930 06:18:36.612856 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-55xft" podUID="486721af-3893-4de3-8a42-efdcbcc78dfe" containerName="registry-server" probeResult="failure" output=< Sep 30 06:18:36 crc kubenswrapper[4956]: timeout: failed to 
connect service ":50051" within 1s Sep 30 06:18:36 crc kubenswrapper[4956]: > Sep 30 06:18:40 crc kubenswrapper[4956]: I0930 06:18:40.348960 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:18:40 crc kubenswrapper[4956]: E0930 06:18:40.349524 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:18:45 crc kubenswrapper[4956]: I0930 06:18:45.614364 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:45 crc kubenswrapper[4956]: I0930 06:18:45.659535 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:45 crc kubenswrapper[4956]: I0930 06:18:45.854073 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-55xft"] Sep 30 06:18:46 crc kubenswrapper[4956]: I0930 06:18:46.699880 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-55xft" podUID="486721af-3893-4de3-8a42-efdcbcc78dfe" containerName="registry-server" containerID="cri-o://b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d" gracePeriod=2 Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.152406 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.285548 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/486721af-3893-4de3-8a42-efdcbcc78dfe-catalog-content\") pod \"486721af-3893-4de3-8a42-efdcbcc78dfe\" (UID: \"486721af-3893-4de3-8a42-efdcbcc78dfe\") " Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.285660 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/486721af-3893-4de3-8a42-efdcbcc78dfe-utilities\") pod \"486721af-3893-4de3-8a42-efdcbcc78dfe\" (UID: \"486721af-3893-4de3-8a42-efdcbcc78dfe\") " Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.285731 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bklvw\" (UniqueName: \"kubernetes.io/projected/486721af-3893-4de3-8a42-efdcbcc78dfe-kube-api-access-bklvw\") pod \"486721af-3893-4de3-8a42-efdcbcc78dfe\" (UID: \"486721af-3893-4de3-8a42-efdcbcc78dfe\") " Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.287225 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/486721af-3893-4de3-8a42-efdcbcc78dfe-utilities" (OuterVolumeSpecName: "utilities") pod "486721af-3893-4de3-8a42-efdcbcc78dfe" (UID: "486721af-3893-4de3-8a42-efdcbcc78dfe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.291456 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486721af-3893-4de3-8a42-efdcbcc78dfe-kube-api-access-bklvw" (OuterVolumeSpecName: "kube-api-access-bklvw") pod "486721af-3893-4de3-8a42-efdcbcc78dfe" (UID: "486721af-3893-4de3-8a42-efdcbcc78dfe"). InnerVolumeSpecName "kube-api-access-bklvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.366747 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/486721af-3893-4de3-8a42-efdcbcc78dfe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "486721af-3893-4de3-8a42-efdcbcc78dfe" (UID: "486721af-3893-4de3-8a42-efdcbcc78dfe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.387710 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/486721af-3893-4de3-8a42-efdcbcc78dfe-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.387740 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/486721af-3893-4de3-8a42-efdcbcc78dfe-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.387751 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bklvw\" (UniqueName: \"kubernetes.io/projected/486721af-3893-4de3-8a42-efdcbcc78dfe-kube-api-access-bklvw\") on node \"crc\" DevicePath \"\"" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.718901 4956 generic.go:334] "Generic (PLEG): container finished" podID="486721af-3893-4de3-8a42-efdcbcc78dfe" containerID="b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d" exitCode=0 Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.719011 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55xft" event={"ID":"486721af-3893-4de3-8a42-efdcbcc78dfe","Type":"ContainerDied","Data":"b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d"} Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.719366 4956 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-55xft" event={"ID":"486721af-3893-4de3-8a42-efdcbcc78dfe","Type":"ContainerDied","Data":"501a0c51ee4d7a13c4fdb7de3205f3d563b155a065777f4ab159fdb09743efc1"} Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.719402 4956 scope.go:117] "RemoveContainer" containerID="b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.719028 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-55xft" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.750526 4956 scope.go:117] "RemoveContainer" containerID="06ec3a7ae919afd0c5cd00a33f4944da86197a292937c8629c498effe3c00012" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.772921 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-55xft"] Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.788812 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-55xft"] Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.791253 4956 scope.go:117] "RemoveContainer" containerID="af0a720f6c798897e96642a387a0087167b1a1e53d4f6abeeab4317a91cd1b82" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.839936 4956 scope.go:117] "RemoveContainer" containerID="b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d" Sep 30 06:18:47 crc kubenswrapper[4956]: E0930 06:18:47.840487 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d\": container with ID starting with b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d not found: ID does not exist" containerID="b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.840563 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d"} err="failed to get container status \"b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d\": rpc error: code = NotFound desc = could not find container \"b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d\": container with ID starting with b9a59d5c309602ac52299713a63e0a0668f4555770c8a7fad81b86705b92233d not found: ID does not exist" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.840601 4956 scope.go:117] "RemoveContainer" containerID="06ec3a7ae919afd0c5cd00a33f4944da86197a292937c8629c498effe3c00012" Sep 30 06:18:47 crc kubenswrapper[4956]: E0930 06:18:47.841055 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ec3a7ae919afd0c5cd00a33f4944da86197a292937c8629c498effe3c00012\": container with ID starting with 06ec3a7ae919afd0c5cd00a33f4944da86197a292937c8629c498effe3c00012 not found: ID does not exist" containerID="06ec3a7ae919afd0c5cd00a33f4944da86197a292937c8629c498effe3c00012" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.841099 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ec3a7ae919afd0c5cd00a33f4944da86197a292937c8629c498effe3c00012"} err="failed to get container status \"06ec3a7ae919afd0c5cd00a33f4944da86197a292937c8629c498effe3c00012\": rpc error: code = NotFound desc = could not find container \"06ec3a7ae919afd0c5cd00a33f4944da86197a292937c8629c498effe3c00012\": container with ID starting with 06ec3a7ae919afd0c5cd00a33f4944da86197a292937c8629c498effe3c00012 not found: ID does not exist" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.841230 4956 scope.go:117] "RemoveContainer" containerID="af0a720f6c798897e96642a387a0087167b1a1e53d4f6abeeab4317a91cd1b82" Sep 30 06:18:47 crc kubenswrapper[4956]: E0930 
06:18:47.841622 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0a720f6c798897e96642a387a0087167b1a1e53d4f6abeeab4317a91cd1b82\": container with ID starting with af0a720f6c798897e96642a387a0087167b1a1e53d4f6abeeab4317a91cd1b82 not found: ID does not exist" containerID="af0a720f6c798897e96642a387a0087167b1a1e53d4f6abeeab4317a91cd1b82" Sep 30 06:18:47 crc kubenswrapper[4956]: I0930 06:18:47.841696 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0a720f6c798897e96642a387a0087167b1a1e53d4f6abeeab4317a91cd1b82"} err="failed to get container status \"af0a720f6c798897e96642a387a0087167b1a1e53d4f6abeeab4317a91cd1b82\": rpc error: code = NotFound desc = could not find container \"af0a720f6c798897e96642a387a0087167b1a1e53d4f6abeeab4317a91cd1b82\": container with ID starting with af0a720f6c798897e96642a387a0087167b1a1e53d4f6abeeab4317a91cd1b82 not found: ID does not exist" Sep 30 06:18:48 crc kubenswrapper[4956]: I0930 06:18:48.363237 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486721af-3893-4de3-8a42-efdcbcc78dfe" path="/var/lib/kubelet/pods/486721af-3893-4de3-8a42-efdcbcc78dfe/volumes" Sep 30 06:18:54 crc kubenswrapper[4956]: I0930 06:18:54.342315 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:18:54 crc kubenswrapper[4956]: E0930 06:18:54.344012 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:19:08 crc kubenswrapper[4956]: I0930 06:19:08.342870 
4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:19:08 crc kubenswrapper[4956]: E0930 06:19:08.343727 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:19:20 crc kubenswrapper[4956]: I0930 06:19:20.348995 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:19:20 crc kubenswrapper[4956]: E0930 06:19:20.349968 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:19:32 crc kubenswrapper[4956]: I0930 06:19:32.342100 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:19:32 crc kubenswrapper[4956]: E0930 06:19:32.343390 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:19:45 crc kubenswrapper[4956]: I0930 
06:19:45.341660 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:19:45 crc kubenswrapper[4956]: E0930 06:19:45.342755 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:19:58 crc kubenswrapper[4956]: I0930 06:19:58.341667 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:19:58 crc kubenswrapper[4956]: E0930 06:19:58.343789 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:20:09 crc kubenswrapper[4956]: I0930 06:20:09.341889 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:20:09 crc kubenswrapper[4956]: E0930 06:20:09.342714 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:20:22 crc 
kubenswrapper[4956]: I0930 06:20:22.341718 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:20:22 crc kubenswrapper[4956]: E0930 06:20:22.342584 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:20:37 crc kubenswrapper[4956]: I0930 06:20:37.341750 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:20:37 crc kubenswrapper[4956]: E0930 06:20:37.342472 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:20:52 crc kubenswrapper[4956]: I0930 06:20:52.342361 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:20:52 crc kubenswrapper[4956]: E0930 06:20:52.343095 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 
30 06:21:05 crc kubenswrapper[4956]: I0930 06:21:05.341828 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:21:05 crc kubenswrapper[4956]: E0930 06:21:05.342578 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:21:16 crc kubenswrapper[4956]: I0930 06:21:16.341836 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:21:16 crc kubenswrapper[4956]: E0930 06:21:16.342677 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:21:28 crc kubenswrapper[4956]: I0930 06:21:28.342390 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:21:29 crc kubenswrapper[4956]: I0930 06:21:29.533956 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"bb8c77a6aea676cc7bdc48431c29c16ecc01e72230b921ff0bad9f838a24da4e"} Sep 30 06:23:48 crc kubenswrapper[4956]: I0930 06:23:48.073217 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:23:48 crc kubenswrapper[4956]: I0930 06:23:48.073714 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:24:18 crc kubenswrapper[4956]: I0930 06:24:18.073077 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:24:18 crc kubenswrapper[4956]: I0930 06:24:18.073649 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:24:48 crc kubenswrapper[4956]: I0930 06:24:48.074053 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:24:48 crc kubenswrapper[4956]: I0930 06:24:48.074624 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:24:48 crc kubenswrapper[4956]: I0930 06:24:48.074673 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 06:24:48 crc kubenswrapper[4956]: I0930 06:24:48.075426 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb8c77a6aea676cc7bdc48431c29c16ecc01e72230b921ff0bad9f838a24da4e"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:24:48 crc kubenswrapper[4956]: I0930 06:24:48.075481 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://bb8c77a6aea676cc7bdc48431c29c16ecc01e72230b921ff0bad9f838a24da4e" gracePeriod=600 Sep 30 06:24:48 crc kubenswrapper[4956]: I0930 06:24:48.663293 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="bb8c77a6aea676cc7bdc48431c29c16ecc01e72230b921ff0bad9f838a24da4e" exitCode=0 Sep 30 06:24:48 crc kubenswrapper[4956]: I0930 06:24:48.663365 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"bb8c77a6aea676cc7bdc48431c29c16ecc01e72230b921ff0bad9f838a24da4e"} Sep 30 06:24:48 crc kubenswrapper[4956]: I0930 06:24:48.663669 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" 
event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c"} Sep 30 06:24:48 crc kubenswrapper[4956]: I0930 06:24:48.663693 4956 scope.go:117] "RemoveContainer" containerID="d1c0df27bdae32e103fababb1b9599c5dd545dc0f760be8eaf2aed4a3209577b" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.572270 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xzpwd"] Sep 30 06:26:40 crc kubenswrapper[4956]: E0930 06:26:40.574033 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="641d195b-fbfe-4c71-953a-ce5a7c6891f1" containerName="extract-content" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.574059 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="641d195b-fbfe-4c71-953a-ce5a7c6891f1" containerName="extract-content" Sep 30 06:26:40 crc kubenswrapper[4956]: E0930 06:26:40.574080 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486721af-3893-4de3-8a42-efdcbcc78dfe" containerName="extract-content" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.574093 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="486721af-3893-4de3-8a42-efdcbcc78dfe" containerName="extract-content" Sep 30 06:26:40 crc kubenswrapper[4956]: E0930 06:26:40.574148 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="641d195b-fbfe-4c71-953a-ce5a7c6891f1" containerName="extract-utilities" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.574163 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="641d195b-fbfe-4c71-953a-ce5a7c6891f1" containerName="extract-utilities" Sep 30 06:26:40 crc kubenswrapper[4956]: E0930 06:26:40.574218 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486721af-3893-4de3-8a42-efdcbcc78dfe" containerName="registry-server" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.574230 4956 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="486721af-3893-4de3-8a42-efdcbcc78dfe" containerName="registry-server" Sep 30 06:26:40 crc kubenswrapper[4956]: E0930 06:26:40.574254 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486721af-3893-4de3-8a42-efdcbcc78dfe" containerName="extract-utilities" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.574265 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="486721af-3893-4de3-8a42-efdcbcc78dfe" containerName="extract-utilities" Sep 30 06:26:40 crc kubenswrapper[4956]: E0930 06:26:40.574285 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="641d195b-fbfe-4c71-953a-ce5a7c6891f1" containerName="registry-server" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.574297 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="641d195b-fbfe-4c71-953a-ce5a7c6891f1" containerName="registry-server" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.574647 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="486721af-3893-4de3-8a42-efdcbcc78dfe" containerName="registry-server" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.574688 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="641d195b-fbfe-4c71-953a-ce5a7c6891f1" containerName="registry-server" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.579669 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.592163 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xzpwd"] Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.636355 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4238d37e-073f-4f09-b892-6a01530a8cbf-catalog-content\") pod \"community-operators-xzpwd\" (UID: \"4238d37e-073f-4f09-b892-6a01530a8cbf\") " pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.636507 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4238d37e-073f-4f09-b892-6a01530a8cbf-utilities\") pod \"community-operators-xzpwd\" (UID: \"4238d37e-073f-4f09-b892-6a01530a8cbf\") " pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.636749 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlj64\" (UniqueName: \"kubernetes.io/projected/4238d37e-073f-4f09-b892-6a01530a8cbf-kube-api-access-hlj64\") pod \"community-operators-xzpwd\" (UID: \"4238d37e-073f-4f09-b892-6a01530a8cbf\") " pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.738547 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlj64\" (UniqueName: \"kubernetes.io/projected/4238d37e-073f-4f09-b892-6a01530a8cbf-kube-api-access-hlj64\") pod \"community-operators-xzpwd\" (UID: \"4238d37e-073f-4f09-b892-6a01530a8cbf\") " pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.738992 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4238d37e-073f-4f09-b892-6a01530a8cbf-catalog-content\") pod \"community-operators-xzpwd\" (UID: \"4238d37e-073f-4f09-b892-6a01530a8cbf\") " pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.739033 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4238d37e-073f-4f09-b892-6a01530a8cbf-utilities\") pod \"community-operators-xzpwd\" (UID: \"4238d37e-073f-4f09-b892-6a01530a8cbf\") " pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.739621 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4238d37e-073f-4f09-b892-6a01530a8cbf-catalog-content\") pod \"community-operators-xzpwd\" (UID: \"4238d37e-073f-4f09-b892-6a01530a8cbf\") " pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.739689 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4238d37e-073f-4f09-b892-6a01530a8cbf-utilities\") pod \"community-operators-xzpwd\" (UID: \"4238d37e-073f-4f09-b892-6a01530a8cbf\") " pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.769689 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlj64\" (UniqueName: \"kubernetes.io/projected/4238d37e-073f-4f09-b892-6a01530a8cbf-kube-api-access-hlj64\") pod \"community-operators-xzpwd\" (UID: \"4238d37e-073f-4f09-b892-6a01530a8cbf\") " pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:40 crc kubenswrapper[4956]: I0930 06:26:40.912821 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:41 crc kubenswrapper[4956]: I0930 06:26:41.442915 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xzpwd"] Sep 30 06:26:41 crc kubenswrapper[4956]: I0930 06:26:41.993241 4956 generic.go:334] "Generic (PLEG): container finished" podID="4238d37e-073f-4f09-b892-6a01530a8cbf" containerID="e3af142e3224b3f66b0ac51645752b54a38d5bbbd70d5299fe91bc1e878f49f7" exitCode=0 Sep 30 06:26:41 crc kubenswrapper[4956]: I0930 06:26:41.993394 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzpwd" event={"ID":"4238d37e-073f-4f09-b892-6a01530a8cbf","Type":"ContainerDied","Data":"e3af142e3224b3f66b0ac51645752b54a38d5bbbd70d5299fe91bc1e878f49f7"} Sep 30 06:26:41 crc kubenswrapper[4956]: I0930 06:26:41.993597 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzpwd" event={"ID":"4238d37e-073f-4f09-b892-6a01530a8cbf","Type":"ContainerStarted","Data":"05f59aef08a1bfe2344807399448ed5f7e31fef19e672aac74d0eb56972d4f75"} Sep 30 06:26:41 crc kubenswrapper[4956]: I0930 06:26:41.995523 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:26:43 crc kubenswrapper[4956]: I0930 06:26:43.007341 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzpwd" event={"ID":"4238d37e-073f-4f09-b892-6a01530a8cbf","Type":"ContainerStarted","Data":"b90682764e9653eb55db765ebadfc36b967865836e64df00398260e6670794fc"} Sep 30 06:26:44 crc kubenswrapper[4956]: I0930 06:26:44.022454 4956 generic.go:334] "Generic (PLEG): container finished" podID="4238d37e-073f-4f09-b892-6a01530a8cbf" containerID="b90682764e9653eb55db765ebadfc36b967865836e64df00398260e6670794fc" exitCode=0 Sep 30 06:26:44 crc kubenswrapper[4956]: I0930 06:26:44.022927 4956 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-xzpwd" event={"ID":"4238d37e-073f-4f09-b892-6a01530a8cbf","Type":"ContainerDied","Data":"b90682764e9653eb55db765ebadfc36b967865836e64df00398260e6670794fc"} Sep 30 06:26:45 crc kubenswrapper[4956]: I0930 06:26:45.036476 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzpwd" event={"ID":"4238d37e-073f-4f09-b892-6a01530a8cbf","Type":"ContainerStarted","Data":"0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413"} Sep 30 06:26:45 crc kubenswrapper[4956]: I0930 06:26:45.063081 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xzpwd" podStartSLOduration=2.621904706 podStartE2EDuration="5.063055688s" podCreationTimestamp="2025-09-30 06:26:40 +0000 UTC" firstStartedPulling="2025-09-30 06:26:41.995242467 +0000 UTC m=+3472.322363012" lastFinishedPulling="2025-09-30 06:26:44.436393469 +0000 UTC m=+3474.763513994" observedRunningTime="2025-09-30 06:26:45.052779915 +0000 UTC m=+3475.379900460" watchObservedRunningTime="2025-09-30 06:26:45.063055688 +0000 UTC m=+3475.390176223" Sep 30 06:26:48 crc kubenswrapper[4956]: I0930 06:26:48.073313 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:26:48 crc kubenswrapper[4956]: I0930 06:26:48.073867 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:26:50 crc kubenswrapper[4956]: I0930 06:26:50.913042 4956 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:50 crc kubenswrapper[4956]: I0930 06:26:50.913639 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:50 crc kubenswrapper[4956]: I0930 06:26:50.974093 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:51 crc kubenswrapper[4956]: I0930 06:26:51.134620 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:51 crc kubenswrapper[4956]: I0930 06:26:51.206032 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xzpwd"] Sep 30 06:26:53 crc kubenswrapper[4956]: I0930 06:26:53.108011 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xzpwd" podUID="4238d37e-073f-4f09-b892-6a01530a8cbf" containerName="registry-server" containerID="cri-o://0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413" gracePeriod=2 Sep 30 06:26:53 crc kubenswrapper[4956]: I0930 06:26:53.617848 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:53 crc kubenswrapper[4956]: I0930 06:26:53.721488 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4238d37e-073f-4f09-b892-6a01530a8cbf-utilities\") pod \"4238d37e-073f-4f09-b892-6a01530a8cbf\" (UID: \"4238d37e-073f-4f09-b892-6a01530a8cbf\") " Sep 30 06:26:53 crc kubenswrapper[4956]: I0930 06:26:53.721827 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4238d37e-073f-4f09-b892-6a01530a8cbf-catalog-content\") pod \"4238d37e-073f-4f09-b892-6a01530a8cbf\" (UID: \"4238d37e-073f-4f09-b892-6a01530a8cbf\") " Sep 30 06:26:53 crc kubenswrapper[4956]: I0930 06:26:53.721954 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlj64\" (UniqueName: \"kubernetes.io/projected/4238d37e-073f-4f09-b892-6a01530a8cbf-kube-api-access-hlj64\") pod \"4238d37e-073f-4f09-b892-6a01530a8cbf\" (UID: \"4238d37e-073f-4f09-b892-6a01530a8cbf\") " Sep 30 06:26:53 crc kubenswrapper[4956]: I0930 06:26:53.722969 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4238d37e-073f-4f09-b892-6a01530a8cbf-utilities" (OuterVolumeSpecName: "utilities") pod "4238d37e-073f-4f09-b892-6a01530a8cbf" (UID: "4238d37e-073f-4f09-b892-6a01530a8cbf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:26:53 crc kubenswrapper[4956]: I0930 06:26:53.733748 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4238d37e-073f-4f09-b892-6a01530a8cbf-kube-api-access-hlj64" (OuterVolumeSpecName: "kube-api-access-hlj64") pod "4238d37e-073f-4f09-b892-6a01530a8cbf" (UID: "4238d37e-073f-4f09-b892-6a01530a8cbf"). InnerVolumeSpecName "kube-api-access-hlj64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:26:53 crc kubenswrapper[4956]: I0930 06:26:53.787048 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4238d37e-073f-4f09-b892-6a01530a8cbf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4238d37e-073f-4f09-b892-6a01530a8cbf" (UID: "4238d37e-073f-4f09-b892-6a01530a8cbf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:26:53 crc kubenswrapper[4956]: I0930 06:26:53.825022 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlj64\" (UniqueName: \"kubernetes.io/projected/4238d37e-073f-4f09-b892-6a01530a8cbf-kube-api-access-hlj64\") on node \"crc\" DevicePath \"\"" Sep 30 06:26:53 crc kubenswrapper[4956]: I0930 06:26:53.825078 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4238d37e-073f-4f09-b892-6a01530a8cbf-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:26:53 crc kubenswrapper[4956]: I0930 06:26:53.825099 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4238d37e-073f-4f09-b892-6a01530a8cbf-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.126104 4956 generic.go:334] "Generic (PLEG): container finished" podID="4238d37e-073f-4f09-b892-6a01530a8cbf" containerID="0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413" exitCode=0 Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.126209 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzpwd" event={"ID":"4238d37e-073f-4f09-b892-6a01530a8cbf","Type":"ContainerDied","Data":"0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413"} Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.126259 4956 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-xzpwd" event={"ID":"4238d37e-073f-4f09-b892-6a01530a8cbf","Type":"ContainerDied","Data":"05f59aef08a1bfe2344807399448ed5f7e31fef19e672aac74d0eb56972d4f75"} Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.126292 4956 scope.go:117] "RemoveContainer" containerID="0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413" Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.127845 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xzpwd" Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.164941 4956 scope.go:117] "RemoveContainer" containerID="b90682764e9653eb55db765ebadfc36b967865836e64df00398260e6670794fc" Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.186759 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xzpwd"] Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.198826 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xzpwd"] Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.217989 4956 scope.go:117] "RemoveContainer" containerID="e3af142e3224b3f66b0ac51645752b54a38d5bbbd70d5299fe91bc1e878f49f7" Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.264272 4956 scope.go:117] "RemoveContainer" containerID="0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413" Sep 30 06:26:54 crc kubenswrapper[4956]: E0930 06:26:54.264779 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413\": container with ID starting with 0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413 not found: ID does not exist" containerID="0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413" Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 
06:26:54.264815 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413"} err="failed to get container status \"0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413\": rpc error: code = NotFound desc = could not find container \"0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413\": container with ID starting with 0e16847ad281848f221dcc6a1a8fc16d9a916c7f9bd2a0e29bd00ae727feb413 not found: ID does not exist" Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.264845 4956 scope.go:117] "RemoveContainer" containerID="b90682764e9653eb55db765ebadfc36b967865836e64df00398260e6670794fc" Sep 30 06:26:54 crc kubenswrapper[4956]: E0930 06:26:54.265341 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b90682764e9653eb55db765ebadfc36b967865836e64df00398260e6670794fc\": container with ID starting with b90682764e9653eb55db765ebadfc36b967865836e64df00398260e6670794fc not found: ID does not exist" containerID="b90682764e9653eb55db765ebadfc36b967865836e64df00398260e6670794fc" Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.265374 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b90682764e9653eb55db765ebadfc36b967865836e64df00398260e6670794fc"} err="failed to get container status \"b90682764e9653eb55db765ebadfc36b967865836e64df00398260e6670794fc\": rpc error: code = NotFound desc = could not find container \"b90682764e9653eb55db765ebadfc36b967865836e64df00398260e6670794fc\": container with ID starting with b90682764e9653eb55db765ebadfc36b967865836e64df00398260e6670794fc not found: ID does not exist" Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.265395 4956 scope.go:117] "RemoveContainer" containerID="e3af142e3224b3f66b0ac51645752b54a38d5bbbd70d5299fe91bc1e878f49f7" Sep 30 06:26:54 crc 
kubenswrapper[4956]: E0930 06:26:54.265756 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3af142e3224b3f66b0ac51645752b54a38d5bbbd70d5299fe91bc1e878f49f7\": container with ID starting with e3af142e3224b3f66b0ac51645752b54a38d5bbbd70d5299fe91bc1e878f49f7 not found: ID does not exist" containerID="e3af142e3224b3f66b0ac51645752b54a38d5bbbd70d5299fe91bc1e878f49f7" Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.265853 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3af142e3224b3f66b0ac51645752b54a38d5bbbd70d5299fe91bc1e878f49f7"} err="failed to get container status \"e3af142e3224b3f66b0ac51645752b54a38d5bbbd70d5299fe91bc1e878f49f7\": rpc error: code = NotFound desc = could not find container \"e3af142e3224b3f66b0ac51645752b54a38d5bbbd70d5299fe91bc1e878f49f7\": container with ID starting with e3af142e3224b3f66b0ac51645752b54a38d5bbbd70d5299fe91bc1e878f49f7 not found: ID does not exist" Sep 30 06:26:54 crc kubenswrapper[4956]: I0930 06:26:54.356033 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4238d37e-073f-4f09-b892-6a01530a8cbf" path="/var/lib/kubelet/pods/4238d37e-073f-4f09-b892-6a01530a8cbf/volumes" Sep 30 06:27:18 crc kubenswrapper[4956]: I0930 06:27:18.073269 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:27:18 crc kubenswrapper[4956]: I0930 06:27:18.073778 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Sep 30 06:27:48 crc kubenswrapper[4956]: I0930 06:27:48.073355 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:27:48 crc kubenswrapper[4956]: I0930 06:27:48.075374 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:27:48 crc kubenswrapper[4956]: I0930 06:27:48.075593 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 06:27:48 crc kubenswrapper[4956]: I0930 06:27:48.076594 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:27:48 crc kubenswrapper[4956]: I0930 06:27:48.077097 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" gracePeriod=600 Sep 30 06:27:48 crc kubenswrapper[4956]: E0930 06:27:48.210917 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:27:48 crc kubenswrapper[4956]: I0930 06:27:48.741519 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" exitCode=0 Sep 30 06:27:48 crc kubenswrapper[4956]: I0930 06:27:48.741587 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c"} Sep 30 06:27:48 crc kubenswrapper[4956]: I0930 06:27:48.741628 4956 scope.go:117] "RemoveContainer" containerID="bb8c77a6aea676cc7bdc48431c29c16ecc01e72230b921ff0bad9f838a24da4e" Sep 30 06:27:48 crc kubenswrapper[4956]: I0930 06:27:48.742441 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:27:48 crc kubenswrapper[4956]: E0930 06:27:48.743655 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:28:02 crc kubenswrapper[4956]: I0930 06:28:02.341734 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:28:02 crc kubenswrapper[4956]: 
E0930 06:28:02.342782 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:28:13 crc kubenswrapper[4956]: I0930 06:28:13.342588 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:28:13 crc kubenswrapper[4956]: E0930 06:28:13.343384 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:28:28 crc kubenswrapper[4956]: I0930 06:28:28.341644 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:28:28 crc kubenswrapper[4956]: E0930 06:28:28.343071 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:28:43 crc kubenswrapper[4956]: I0930 06:28:43.346633 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:28:43 crc 
kubenswrapper[4956]: E0930 06:28:43.353302 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.239834 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sctgn"] Sep 30 06:28:47 crc kubenswrapper[4956]: E0930 06:28:47.241140 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4238d37e-073f-4f09-b892-6a01530a8cbf" containerName="extract-utilities" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.241165 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4238d37e-073f-4f09-b892-6a01530a8cbf" containerName="extract-utilities" Sep 30 06:28:47 crc kubenswrapper[4956]: E0930 06:28:47.241189 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4238d37e-073f-4f09-b892-6a01530a8cbf" containerName="extract-content" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.241201 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4238d37e-073f-4f09-b892-6a01530a8cbf" containerName="extract-content" Sep 30 06:28:47 crc kubenswrapper[4956]: E0930 06:28:47.241251 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4238d37e-073f-4f09-b892-6a01530a8cbf" containerName="registry-server" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.241261 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4238d37e-073f-4f09-b892-6a01530a8cbf" containerName="registry-server" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.241586 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4238d37e-073f-4f09-b892-6a01530a8cbf" containerName="registry-server" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.243995 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.258827 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sctgn"] Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.341139 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34490908-06a0-4c23-bcb7-b1c06159d592-catalog-content\") pod \"redhat-operators-sctgn\" (UID: \"34490908-06a0-4c23-bcb7-b1c06159d592\") " pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.341428 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72lpx\" (UniqueName: \"kubernetes.io/projected/34490908-06a0-4c23-bcb7-b1c06159d592-kube-api-access-72lpx\") pod \"redhat-operators-sctgn\" (UID: \"34490908-06a0-4c23-bcb7-b1c06159d592\") " pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.341502 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34490908-06a0-4c23-bcb7-b1c06159d592-utilities\") pod \"redhat-operators-sctgn\" (UID: \"34490908-06a0-4c23-bcb7-b1c06159d592\") " pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.458521 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34490908-06a0-4c23-bcb7-b1c06159d592-catalog-content\") pod \"redhat-operators-sctgn\" (UID: 
\"34490908-06a0-4c23-bcb7-b1c06159d592\") " pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.458621 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72lpx\" (UniqueName: \"kubernetes.io/projected/34490908-06a0-4c23-bcb7-b1c06159d592-kube-api-access-72lpx\") pod \"redhat-operators-sctgn\" (UID: \"34490908-06a0-4c23-bcb7-b1c06159d592\") " pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.458748 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34490908-06a0-4c23-bcb7-b1c06159d592-utilities\") pod \"redhat-operators-sctgn\" (UID: \"34490908-06a0-4c23-bcb7-b1c06159d592\") " pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.459958 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34490908-06a0-4c23-bcb7-b1c06159d592-catalog-content\") pod \"redhat-operators-sctgn\" (UID: \"34490908-06a0-4c23-bcb7-b1c06159d592\") " pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.460427 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34490908-06a0-4c23-bcb7-b1c06159d592-utilities\") pod \"redhat-operators-sctgn\" (UID: \"34490908-06a0-4c23-bcb7-b1c06159d592\") " pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.482275 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72lpx\" (UniqueName: \"kubernetes.io/projected/34490908-06a0-4c23-bcb7-b1c06159d592-kube-api-access-72lpx\") pod \"redhat-operators-sctgn\" (UID: \"34490908-06a0-4c23-bcb7-b1c06159d592\") " 
pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:47 crc kubenswrapper[4956]: I0930 06:28:47.575170 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:48 crc kubenswrapper[4956]: I0930 06:28:48.082384 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sctgn"] Sep 30 06:28:48 crc kubenswrapper[4956]: I0930 06:28:48.396354 4956 generic.go:334] "Generic (PLEG): container finished" podID="34490908-06a0-4c23-bcb7-b1c06159d592" containerID="b305c1211dc7ac6f67c60122aee92150b6d11dc4951a568ff5d7434de4c631b7" exitCode=0 Sep 30 06:28:48 crc kubenswrapper[4956]: I0930 06:28:48.396415 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sctgn" event={"ID":"34490908-06a0-4c23-bcb7-b1c06159d592","Type":"ContainerDied","Data":"b305c1211dc7ac6f67c60122aee92150b6d11dc4951a568ff5d7434de4c631b7"} Sep 30 06:28:48 crc kubenswrapper[4956]: I0930 06:28:48.396608 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sctgn" event={"ID":"34490908-06a0-4c23-bcb7-b1c06159d592","Type":"ContainerStarted","Data":"8f4920f8936ce8f063b09c446e0dbd76c141e0a20f70405ab136e86457b1d413"} Sep 30 06:28:50 crc kubenswrapper[4956]: I0930 06:28:50.420065 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sctgn" event={"ID":"34490908-06a0-4c23-bcb7-b1c06159d592","Type":"ContainerStarted","Data":"a8200344fc8d21fa4c1d6bd56c3f88dc905e304138e10da0e174bc3dba17868a"} Sep 30 06:28:51 crc kubenswrapper[4956]: I0930 06:28:51.464806 4956 generic.go:334] "Generic (PLEG): container finished" podID="34490908-06a0-4c23-bcb7-b1c06159d592" containerID="a8200344fc8d21fa4c1d6bd56c3f88dc905e304138e10da0e174bc3dba17868a" exitCode=0 Sep 30 06:28:51 crc kubenswrapper[4956]: I0930 06:28:51.464871 4956 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-sctgn" event={"ID":"34490908-06a0-4c23-bcb7-b1c06159d592","Type":"ContainerDied","Data":"a8200344fc8d21fa4c1d6bd56c3f88dc905e304138e10da0e174bc3dba17868a"} Sep 30 06:28:52 crc kubenswrapper[4956]: I0930 06:28:52.476860 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sctgn" event={"ID":"34490908-06a0-4c23-bcb7-b1c06159d592","Type":"ContainerStarted","Data":"2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d"} Sep 30 06:28:52 crc kubenswrapper[4956]: I0930 06:28:52.500647 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sctgn" podStartSLOduration=1.894262871 podStartE2EDuration="5.500632276s" podCreationTimestamp="2025-09-30 06:28:47 +0000 UTC" firstStartedPulling="2025-09-30 06:28:48.398450146 +0000 UTC m=+3598.725570661" lastFinishedPulling="2025-09-30 06:28:52.004819511 +0000 UTC m=+3602.331940066" observedRunningTime="2025-09-30 06:28:52.497556469 +0000 UTC m=+3602.824676994" watchObservedRunningTime="2025-09-30 06:28:52.500632276 +0000 UTC m=+3602.827752801" Sep 30 06:28:56 crc kubenswrapper[4956]: I0930 06:28:56.341019 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:28:56 crc kubenswrapper[4956]: E0930 06:28:56.341838 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:28:57 crc kubenswrapper[4956]: I0930 06:28:57.575940 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:57 crc kubenswrapper[4956]: I0930 06:28:57.576338 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:57 crc kubenswrapper[4956]: I0930 06:28:57.642235 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:58 crc kubenswrapper[4956]: I0930 06:28:58.621931 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:28:58 crc kubenswrapper[4956]: I0930 06:28:58.679682 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sctgn"] Sep 30 06:29:00 crc kubenswrapper[4956]: I0930 06:29:00.580385 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sctgn" podUID="34490908-06a0-4c23-bcb7-b1c06159d592" containerName="registry-server" containerID="cri-o://2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d" gracePeriod=2 Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.045896 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.141084 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34490908-06a0-4c23-bcb7-b1c06159d592-utilities\") pod \"34490908-06a0-4c23-bcb7-b1c06159d592\" (UID: \"34490908-06a0-4c23-bcb7-b1c06159d592\") " Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.141293 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72lpx\" (UniqueName: \"kubernetes.io/projected/34490908-06a0-4c23-bcb7-b1c06159d592-kube-api-access-72lpx\") pod \"34490908-06a0-4c23-bcb7-b1c06159d592\" (UID: \"34490908-06a0-4c23-bcb7-b1c06159d592\") " Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.141327 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34490908-06a0-4c23-bcb7-b1c06159d592-catalog-content\") pod \"34490908-06a0-4c23-bcb7-b1c06159d592\" (UID: \"34490908-06a0-4c23-bcb7-b1c06159d592\") " Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.142083 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34490908-06a0-4c23-bcb7-b1c06159d592-utilities" (OuterVolumeSpecName: "utilities") pod "34490908-06a0-4c23-bcb7-b1c06159d592" (UID: "34490908-06a0-4c23-bcb7-b1c06159d592"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.147338 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34490908-06a0-4c23-bcb7-b1c06159d592-kube-api-access-72lpx" (OuterVolumeSpecName: "kube-api-access-72lpx") pod "34490908-06a0-4c23-bcb7-b1c06159d592" (UID: "34490908-06a0-4c23-bcb7-b1c06159d592"). InnerVolumeSpecName "kube-api-access-72lpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.224764 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34490908-06a0-4c23-bcb7-b1c06159d592-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34490908-06a0-4c23-bcb7-b1c06159d592" (UID: "34490908-06a0-4c23-bcb7-b1c06159d592"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.243518 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34490908-06a0-4c23-bcb7-b1c06159d592-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.243716 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72lpx\" (UniqueName: \"kubernetes.io/projected/34490908-06a0-4c23-bcb7-b1c06159d592-kube-api-access-72lpx\") on node \"crc\" DevicePath \"\"" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.243806 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34490908-06a0-4c23-bcb7-b1c06159d592-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.592097 4956 generic.go:334] "Generic (PLEG): container finished" podID="34490908-06a0-4c23-bcb7-b1c06159d592" containerID="2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d" exitCode=0 Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.592161 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sctgn" event={"ID":"34490908-06a0-4c23-bcb7-b1c06159d592","Type":"ContainerDied","Data":"2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d"} Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.592188 4956 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sctgn" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.592211 4956 scope.go:117] "RemoveContainer" containerID="2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.592198 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sctgn" event={"ID":"34490908-06a0-4c23-bcb7-b1c06159d592","Type":"ContainerDied","Data":"8f4920f8936ce8f063b09c446e0dbd76c141e0a20f70405ab136e86457b1d413"} Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.615188 4956 scope.go:117] "RemoveContainer" containerID="a8200344fc8d21fa4c1d6bd56c3f88dc905e304138e10da0e174bc3dba17868a" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.646642 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sctgn"] Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.656657 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sctgn"] Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.658446 4956 scope.go:117] "RemoveContainer" containerID="b305c1211dc7ac6f67c60122aee92150b6d11dc4951a568ff5d7434de4c631b7" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.690005 4956 scope.go:117] "RemoveContainer" containerID="2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d" Sep 30 06:29:01 crc kubenswrapper[4956]: E0930 06:29:01.690490 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d\": container with ID starting with 2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d not found: ID does not exist" containerID="2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.690531 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d"} err="failed to get container status \"2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d\": rpc error: code = NotFound desc = could not find container \"2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d\": container with ID starting with 2f7aae6ade4a69e88892fe7725696090cd252f1b927af55fb5ee834e7dc66a0d not found: ID does not exist" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.690558 4956 scope.go:117] "RemoveContainer" containerID="a8200344fc8d21fa4c1d6bd56c3f88dc905e304138e10da0e174bc3dba17868a" Sep 30 06:29:01 crc kubenswrapper[4956]: E0930 06:29:01.690775 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8200344fc8d21fa4c1d6bd56c3f88dc905e304138e10da0e174bc3dba17868a\": container with ID starting with a8200344fc8d21fa4c1d6bd56c3f88dc905e304138e10da0e174bc3dba17868a not found: ID does not exist" containerID="a8200344fc8d21fa4c1d6bd56c3f88dc905e304138e10da0e174bc3dba17868a" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.690798 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8200344fc8d21fa4c1d6bd56c3f88dc905e304138e10da0e174bc3dba17868a"} err="failed to get container status \"a8200344fc8d21fa4c1d6bd56c3f88dc905e304138e10da0e174bc3dba17868a\": rpc error: code = NotFound desc = could not find container \"a8200344fc8d21fa4c1d6bd56c3f88dc905e304138e10da0e174bc3dba17868a\": container with ID starting with a8200344fc8d21fa4c1d6bd56c3f88dc905e304138e10da0e174bc3dba17868a not found: ID does not exist" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.690812 4956 scope.go:117] "RemoveContainer" containerID="b305c1211dc7ac6f67c60122aee92150b6d11dc4951a568ff5d7434de4c631b7" Sep 30 06:29:01 crc kubenswrapper[4956]: E0930 
06:29:01.691057 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b305c1211dc7ac6f67c60122aee92150b6d11dc4951a568ff5d7434de4c631b7\": container with ID starting with b305c1211dc7ac6f67c60122aee92150b6d11dc4951a568ff5d7434de4c631b7 not found: ID does not exist" containerID="b305c1211dc7ac6f67c60122aee92150b6d11dc4951a568ff5d7434de4c631b7" Sep 30 06:29:01 crc kubenswrapper[4956]: I0930 06:29:01.691080 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b305c1211dc7ac6f67c60122aee92150b6d11dc4951a568ff5d7434de4c631b7"} err="failed to get container status \"b305c1211dc7ac6f67c60122aee92150b6d11dc4951a568ff5d7434de4c631b7\": rpc error: code = NotFound desc = could not find container \"b305c1211dc7ac6f67c60122aee92150b6d11dc4951a568ff5d7434de4c631b7\": container with ID starting with b305c1211dc7ac6f67c60122aee92150b6d11dc4951a568ff5d7434de4c631b7 not found: ID does not exist" Sep 30 06:29:02 crc kubenswrapper[4956]: I0930 06:29:02.361745 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34490908-06a0-4c23-bcb7-b1c06159d592" path="/var/lib/kubelet/pods/34490908-06a0-4c23-bcb7-b1c06159d592/volumes" Sep 30 06:29:09 crc kubenswrapper[4956]: I0930 06:29:09.342452 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:29:09 crc kubenswrapper[4956]: E0930 06:29:09.343404 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.019767 
4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bscm5"] Sep 30 06:29:14 crc kubenswrapper[4956]: E0930 06:29:14.021773 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34490908-06a0-4c23-bcb7-b1c06159d592" containerName="registry-server" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.021873 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="34490908-06a0-4c23-bcb7-b1c06159d592" containerName="registry-server" Sep 30 06:29:14 crc kubenswrapper[4956]: E0930 06:29:14.021956 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34490908-06a0-4c23-bcb7-b1c06159d592" containerName="extract-utilities" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.022038 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="34490908-06a0-4c23-bcb7-b1c06159d592" containerName="extract-utilities" Sep 30 06:29:14 crc kubenswrapper[4956]: E0930 06:29:14.022167 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34490908-06a0-4c23-bcb7-b1c06159d592" containerName="extract-content" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.022248 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="34490908-06a0-4c23-bcb7-b1c06159d592" containerName="extract-content" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.022602 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="34490908-06a0-4c23-bcb7-b1c06159d592" containerName="registry-server" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.024628 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.039116 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bscm5"] Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.118785 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68b1f7d-3765-4e97-954f-7a7153aab08a-catalog-content\") pod \"redhat-marketplace-bscm5\" (UID: \"f68b1f7d-3765-4e97-954f-7a7153aab08a\") " pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.119104 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68b1f7d-3765-4e97-954f-7a7153aab08a-utilities\") pod \"redhat-marketplace-bscm5\" (UID: \"f68b1f7d-3765-4e97-954f-7a7153aab08a\") " pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.119228 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrtpd\" (UniqueName: \"kubernetes.io/projected/f68b1f7d-3765-4e97-954f-7a7153aab08a-kube-api-access-vrtpd\") pod \"redhat-marketplace-bscm5\" (UID: \"f68b1f7d-3765-4e97-954f-7a7153aab08a\") " pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.221317 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68b1f7d-3765-4e97-954f-7a7153aab08a-catalog-content\") pod \"redhat-marketplace-bscm5\" (UID: \"f68b1f7d-3765-4e97-954f-7a7153aab08a\") " pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.221387 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68b1f7d-3765-4e97-954f-7a7153aab08a-utilities\") pod \"redhat-marketplace-bscm5\" (UID: \"f68b1f7d-3765-4e97-954f-7a7153aab08a\") " pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.221456 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrtpd\" (UniqueName: \"kubernetes.io/projected/f68b1f7d-3765-4e97-954f-7a7153aab08a-kube-api-access-vrtpd\") pod \"redhat-marketplace-bscm5\" (UID: \"f68b1f7d-3765-4e97-954f-7a7153aab08a\") " pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.222015 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68b1f7d-3765-4e97-954f-7a7153aab08a-catalog-content\") pod \"redhat-marketplace-bscm5\" (UID: \"f68b1f7d-3765-4e97-954f-7a7153aab08a\") " pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.222023 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68b1f7d-3765-4e97-954f-7a7153aab08a-utilities\") pod \"redhat-marketplace-bscm5\" (UID: \"f68b1f7d-3765-4e97-954f-7a7153aab08a\") " pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.250786 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrtpd\" (UniqueName: \"kubernetes.io/projected/f68b1f7d-3765-4e97-954f-7a7153aab08a-kube-api-access-vrtpd\") pod \"redhat-marketplace-bscm5\" (UID: \"f68b1f7d-3765-4e97-954f-7a7153aab08a\") " pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.368062 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:14 crc kubenswrapper[4956]: I0930 06:29:14.883771 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bscm5"] Sep 30 06:29:15 crc kubenswrapper[4956]: I0930 06:29:15.769700 4956 generic.go:334] "Generic (PLEG): container finished" podID="f68b1f7d-3765-4e97-954f-7a7153aab08a" containerID="e40d8b2baa3c507e7cbe6d4b0a8e483d514fc9af993bab2d15eaba07d9525c42" exitCode=0 Sep 30 06:29:15 crc kubenswrapper[4956]: I0930 06:29:15.769747 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bscm5" event={"ID":"f68b1f7d-3765-4e97-954f-7a7153aab08a","Type":"ContainerDied","Data":"e40d8b2baa3c507e7cbe6d4b0a8e483d514fc9af993bab2d15eaba07d9525c42"} Sep 30 06:29:15 crc kubenswrapper[4956]: I0930 06:29:15.770630 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bscm5" event={"ID":"f68b1f7d-3765-4e97-954f-7a7153aab08a","Type":"ContainerStarted","Data":"917cfcdf15552ca0cb2248a6cfa41ef51a9f8c6d916994eb59907c6dc617f72f"} Sep 30 06:29:17 crc kubenswrapper[4956]: I0930 06:29:17.792213 4956 generic.go:334] "Generic (PLEG): container finished" podID="f68b1f7d-3765-4e97-954f-7a7153aab08a" containerID="a1c29c31ffa966f118ec52c0c1bccc030d569494b20ad974b7a0cb1f83b2c53c" exitCode=0 Sep 30 06:29:17 crc kubenswrapper[4956]: I0930 06:29:17.793019 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bscm5" event={"ID":"f68b1f7d-3765-4e97-954f-7a7153aab08a","Type":"ContainerDied","Data":"a1c29c31ffa966f118ec52c0c1bccc030d569494b20ad974b7a0cb1f83b2c53c"} Sep 30 06:29:18 crc kubenswrapper[4956]: I0930 06:29:18.807217 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bscm5" 
event={"ID":"f68b1f7d-3765-4e97-954f-7a7153aab08a","Type":"ContainerStarted","Data":"8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33"} Sep 30 06:29:18 crc kubenswrapper[4956]: I0930 06:29:18.830405 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bscm5" podStartSLOduration=3.414302801 podStartE2EDuration="5.830388026s" podCreationTimestamp="2025-09-30 06:29:13 +0000 UTC" firstStartedPulling="2025-09-30 06:29:15.773772427 +0000 UTC m=+3626.100892952" lastFinishedPulling="2025-09-30 06:29:18.189857652 +0000 UTC m=+3628.516978177" observedRunningTime="2025-09-30 06:29:18.824831992 +0000 UTC m=+3629.151952517" watchObservedRunningTime="2025-09-30 06:29:18.830388026 +0000 UTC m=+3629.157508551" Sep 30 06:29:23 crc kubenswrapper[4956]: I0930 06:29:23.341764 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:29:23 crc kubenswrapper[4956]: E0930 06:29:23.344768 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:29:24 crc kubenswrapper[4956]: I0930 06:29:24.368957 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:24 crc kubenswrapper[4956]: I0930 06:29:24.369002 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:24 crc kubenswrapper[4956]: I0930 06:29:24.422624 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:24 crc kubenswrapper[4956]: I0930 06:29:24.916795 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:24 crc kubenswrapper[4956]: I0930 06:29:24.972204 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bscm5"] Sep 30 06:29:26 crc kubenswrapper[4956]: I0930 06:29:26.891281 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bscm5" podUID="f68b1f7d-3765-4e97-954f-7a7153aab08a" containerName="registry-server" containerID="cri-o://8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33" gracePeriod=2 Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.403603 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.514104 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68b1f7d-3765-4e97-954f-7a7153aab08a-catalog-content\") pod \"f68b1f7d-3765-4e97-954f-7a7153aab08a\" (UID: \"f68b1f7d-3765-4e97-954f-7a7153aab08a\") " Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.514363 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68b1f7d-3765-4e97-954f-7a7153aab08a-utilities\") pod \"f68b1f7d-3765-4e97-954f-7a7153aab08a\" (UID: \"f68b1f7d-3765-4e97-954f-7a7153aab08a\") " Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.514482 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrtpd\" (UniqueName: \"kubernetes.io/projected/f68b1f7d-3765-4e97-954f-7a7153aab08a-kube-api-access-vrtpd\") pod 
\"f68b1f7d-3765-4e97-954f-7a7153aab08a\" (UID: \"f68b1f7d-3765-4e97-954f-7a7153aab08a\") " Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.515516 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f68b1f7d-3765-4e97-954f-7a7153aab08a-utilities" (OuterVolumeSpecName: "utilities") pod "f68b1f7d-3765-4e97-954f-7a7153aab08a" (UID: "f68b1f7d-3765-4e97-954f-7a7153aab08a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.523377 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f68b1f7d-3765-4e97-954f-7a7153aab08a-kube-api-access-vrtpd" (OuterVolumeSpecName: "kube-api-access-vrtpd") pod "f68b1f7d-3765-4e97-954f-7a7153aab08a" (UID: "f68b1f7d-3765-4e97-954f-7a7153aab08a"). InnerVolumeSpecName "kube-api-access-vrtpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.531244 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f68b1f7d-3765-4e97-954f-7a7153aab08a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f68b1f7d-3765-4e97-954f-7a7153aab08a" (UID: "f68b1f7d-3765-4e97-954f-7a7153aab08a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.617713 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f68b1f7d-3765-4e97-954f-7a7153aab08a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.617756 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f68b1f7d-3765-4e97-954f-7a7153aab08a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.617772 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrtpd\" (UniqueName: \"kubernetes.io/projected/f68b1f7d-3765-4e97-954f-7a7153aab08a-kube-api-access-vrtpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.904152 4956 generic.go:334] "Generic (PLEG): container finished" podID="f68b1f7d-3765-4e97-954f-7a7153aab08a" containerID="8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33" exitCode=0 Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.904270 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bscm5" event={"ID":"f68b1f7d-3765-4e97-954f-7a7153aab08a","Type":"ContainerDied","Data":"8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33"} Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.904654 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bscm5" event={"ID":"f68b1f7d-3765-4e97-954f-7a7153aab08a","Type":"ContainerDied","Data":"917cfcdf15552ca0cb2248a6cfa41ef51a9f8c6d916994eb59907c6dc617f72f"} Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.904692 4956 scope.go:117] "RemoveContainer" containerID="8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33" Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 
06:29:27.904322 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bscm5" Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.933789 4956 scope.go:117] "RemoveContainer" containerID="a1c29c31ffa966f118ec52c0c1bccc030d569494b20ad974b7a0cb1f83b2c53c" Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.987924 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bscm5"] Sep 30 06:29:27 crc kubenswrapper[4956]: I0930 06:29:27.997776 4956 scope.go:117] "RemoveContainer" containerID="e40d8b2baa3c507e7cbe6d4b0a8e483d514fc9af993bab2d15eaba07d9525c42" Sep 30 06:29:28 crc kubenswrapper[4956]: I0930 06:29:28.005053 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bscm5"] Sep 30 06:29:28 crc kubenswrapper[4956]: I0930 06:29:28.046775 4956 scope.go:117] "RemoveContainer" containerID="8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33" Sep 30 06:29:28 crc kubenswrapper[4956]: E0930 06:29:28.047438 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33\": container with ID starting with 8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33 not found: ID does not exist" containerID="8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33" Sep 30 06:29:28 crc kubenswrapper[4956]: I0930 06:29:28.047500 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33"} err="failed to get container status \"8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33\": rpc error: code = NotFound desc = could not find container \"8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33\": container with ID starting with 
8f866f555617968a9a673878988447f35c16946ffdeaeea5dd7383b6b77b0d33 not found: ID does not exist" Sep 30 06:29:28 crc kubenswrapper[4956]: I0930 06:29:28.047529 4956 scope.go:117] "RemoveContainer" containerID="a1c29c31ffa966f118ec52c0c1bccc030d569494b20ad974b7a0cb1f83b2c53c" Sep 30 06:29:28 crc kubenswrapper[4956]: E0930 06:29:28.048022 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1c29c31ffa966f118ec52c0c1bccc030d569494b20ad974b7a0cb1f83b2c53c\": container with ID starting with a1c29c31ffa966f118ec52c0c1bccc030d569494b20ad974b7a0cb1f83b2c53c not found: ID does not exist" containerID="a1c29c31ffa966f118ec52c0c1bccc030d569494b20ad974b7a0cb1f83b2c53c" Sep 30 06:29:28 crc kubenswrapper[4956]: I0930 06:29:28.048057 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1c29c31ffa966f118ec52c0c1bccc030d569494b20ad974b7a0cb1f83b2c53c"} err="failed to get container status \"a1c29c31ffa966f118ec52c0c1bccc030d569494b20ad974b7a0cb1f83b2c53c\": rpc error: code = NotFound desc = could not find container \"a1c29c31ffa966f118ec52c0c1bccc030d569494b20ad974b7a0cb1f83b2c53c\": container with ID starting with a1c29c31ffa966f118ec52c0c1bccc030d569494b20ad974b7a0cb1f83b2c53c not found: ID does not exist" Sep 30 06:29:28 crc kubenswrapper[4956]: I0930 06:29:28.048098 4956 scope.go:117] "RemoveContainer" containerID="e40d8b2baa3c507e7cbe6d4b0a8e483d514fc9af993bab2d15eaba07d9525c42" Sep 30 06:29:28 crc kubenswrapper[4956]: E0930 06:29:28.048439 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e40d8b2baa3c507e7cbe6d4b0a8e483d514fc9af993bab2d15eaba07d9525c42\": container with ID starting with e40d8b2baa3c507e7cbe6d4b0a8e483d514fc9af993bab2d15eaba07d9525c42 not found: ID does not exist" containerID="e40d8b2baa3c507e7cbe6d4b0a8e483d514fc9af993bab2d15eaba07d9525c42" Sep 30 06:29:28 crc 
kubenswrapper[4956]: I0930 06:29:28.048478 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e40d8b2baa3c507e7cbe6d4b0a8e483d514fc9af993bab2d15eaba07d9525c42"} err="failed to get container status \"e40d8b2baa3c507e7cbe6d4b0a8e483d514fc9af993bab2d15eaba07d9525c42\": rpc error: code = NotFound desc = could not find container \"e40d8b2baa3c507e7cbe6d4b0a8e483d514fc9af993bab2d15eaba07d9525c42\": container with ID starting with e40d8b2baa3c507e7cbe6d4b0a8e483d514fc9af993bab2d15eaba07d9525c42 not found: ID does not exist" Sep 30 06:29:28 crc kubenswrapper[4956]: I0930 06:29:28.366048 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f68b1f7d-3765-4e97-954f-7a7153aab08a" path="/var/lib/kubelet/pods/f68b1f7d-3765-4e97-954f-7a7153aab08a/volumes" Sep 30 06:29:37 crc kubenswrapper[4956]: I0930 06:29:37.341591 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:29:37 crc kubenswrapper[4956]: E0930 06:29:37.343395 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:29:48 crc kubenswrapper[4956]: I0930 06:29:48.341965 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:29:48 crc kubenswrapper[4956]: E0930 06:29:48.342890 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.187363 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6"] Sep 30 06:30:00 crc kubenswrapper[4956]: E0930 06:30:00.193365 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68b1f7d-3765-4e97-954f-7a7153aab08a" containerName="extract-content" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.193555 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68b1f7d-3765-4e97-954f-7a7153aab08a" containerName="extract-content" Sep 30 06:30:00 crc kubenswrapper[4956]: E0930 06:30:00.193637 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68b1f7d-3765-4e97-954f-7a7153aab08a" containerName="extract-utilities" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.193697 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68b1f7d-3765-4e97-954f-7a7153aab08a" containerName="extract-utilities" Sep 30 06:30:00 crc kubenswrapper[4956]: E0930 06:30:00.193753 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68b1f7d-3765-4e97-954f-7a7153aab08a" containerName="registry-server" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.193813 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68b1f7d-3765-4e97-954f-7a7153aab08a" containerName="registry-server" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.194689 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68b1f7d-3765-4e97-954f-7a7153aab08a" containerName="registry-server" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.196149 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.201533 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.212470 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.236508 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6"] Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.339363 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6831bf62-b6e7-420c-9f69-bce3e2921222-config-volume\") pod \"collect-profiles-29320230-v2pz6\" (UID: \"6831bf62-b6e7-420c-9f69-bce3e2921222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.339437 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6831bf62-b6e7-420c-9f69-bce3e2921222-secret-volume\") pod \"collect-profiles-29320230-v2pz6\" (UID: \"6831bf62-b6e7-420c-9f69-bce3e2921222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.339733 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49chs\" (UniqueName: \"kubernetes.io/projected/6831bf62-b6e7-420c-9f69-bce3e2921222-kube-api-access-49chs\") pod \"collect-profiles-29320230-v2pz6\" (UID: \"6831bf62-b6e7-420c-9f69-bce3e2921222\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.442797 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6831bf62-b6e7-420c-9f69-bce3e2921222-config-volume\") pod \"collect-profiles-29320230-v2pz6\" (UID: \"6831bf62-b6e7-420c-9f69-bce3e2921222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.442887 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6831bf62-b6e7-420c-9f69-bce3e2921222-secret-volume\") pod \"collect-profiles-29320230-v2pz6\" (UID: \"6831bf62-b6e7-420c-9f69-bce3e2921222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.443091 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49chs\" (UniqueName: \"kubernetes.io/projected/6831bf62-b6e7-420c-9f69-bce3e2921222-kube-api-access-49chs\") pod \"collect-profiles-29320230-v2pz6\" (UID: \"6831bf62-b6e7-420c-9f69-bce3e2921222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.444309 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6831bf62-b6e7-420c-9f69-bce3e2921222-config-volume\") pod \"collect-profiles-29320230-v2pz6\" (UID: \"6831bf62-b6e7-420c-9f69-bce3e2921222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.449880 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6831bf62-b6e7-420c-9f69-bce3e2921222-secret-volume\") pod \"collect-profiles-29320230-v2pz6\" (UID: \"6831bf62-b6e7-420c-9f69-bce3e2921222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.460198 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49chs\" (UniqueName: \"kubernetes.io/projected/6831bf62-b6e7-420c-9f69-bce3e2921222-kube-api-access-49chs\") pod \"collect-profiles-29320230-v2pz6\" (UID: \"6831bf62-b6e7-420c-9f69-bce3e2921222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.552242 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:00 crc kubenswrapper[4956]: I0930 06:30:00.992406 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6"] Sep 30 06:30:01 crc kubenswrapper[4956]: I0930 06:30:01.252659 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" event={"ID":"6831bf62-b6e7-420c-9f69-bce3e2921222","Type":"ContainerStarted","Data":"e4a5e80e6551c1f52c1625352c1a80cf2c7b189dfe33c6922430681d85dc4f51"} Sep 30 06:30:01 crc kubenswrapper[4956]: I0930 06:30:01.252736 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" event={"ID":"6831bf62-b6e7-420c-9f69-bce3e2921222","Type":"ContainerStarted","Data":"7c2ca9b73dd8007f47e82366e821bc3d3e1db535a27210f97a8f03af15497ad2"} Sep 30 06:30:01 crc kubenswrapper[4956]: I0930 06:30:01.275350 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" 
podStartSLOduration=1.275329682 podStartE2EDuration="1.275329682s" podCreationTimestamp="2025-09-30 06:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:30:01.26731558 +0000 UTC m=+3671.594436125" watchObservedRunningTime="2025-09-30 06:30:01.275329682 +0000 UTC m=+3671.602450207" Sep 30 06:30:02 crc kubenswrapper[4956]: I0930 06:30:02.265438 4956 generic.go:334] "Generic (PLEG): container finished" podID="6831bf62-b6e7-420c-9f69-bce3e2921222" containerID="e4a5e80e6551c1f52c1625352c1a80cf2c7b189dfe33c6922430681d85dc4f51" exitCode=0 Sep 30 06:30:02 crc kubenswrapper[4956]: I0930 06:30:02.265644 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" event={"ID":"6831bf62-b6e7-420c-9f69-bce3e2921222","Type":"ContainerDied","Data":"e4a5e80e6551c1f52c1625352c1a80cf2c7b189dfe33c6922430681d85dc4f51"} Sep 30 06:30:03 crc kubenswrapper[4956]: I0930 06:30:03.341966 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:30:03 crc kubenswrapper[4956]: E0930 06:30:03.342849 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:30:03 crc kubenswrapper[4956]: I0930 06:30:03.765055 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:03 crc kubenswrapper[4956]: I0930 06:30:03.921170 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49chs\" (UniqueName: \"kubernetes.io/projected/6831bf62-b6e7-420c-9f69-bce3e2921222-kube-api-access-49chs\") pod \"6831bf62-b6e7-420c-9f69-bce3e2921222\" (UID: \"6831bf62-b6e7-420c-9f69-bce3e2921222\") " Sep 30 06:30:03 crc kubenswrapper[4956]: I0930 06:30:03.921435 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6831bf62-b6e7-420c-9f69-bce3e2921222-config-volume\") pod \"6831bf62-b6e7-420c-9f69-bce3e2921222\" (UID: \"6831bf62-b6e7-420c-9f69-bce3e2921222\") " Sep 30 06:30:03 crc kubenswrapper[4956]: I0930 06:30:03.921513 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6831bf62-b6e7-420c-9f69-bce3e2921222-secret-volume\") pod \"6831bf62-b6e7-420c-9f69-bce3e2921222\" (UID: \"6831bf62-b6e7-420c-9f69-bce3e2921222\") " Sep 30 06:30:03 crc kubenswrapper[4956]: I0930 06:30:03.926045 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6831bf62-b6e7-420c-9f69-bce3e2921222-config-volume" (OuterVolumeSpecName: "config-volume") pod "6831bf62-b6e7-420c-9f69-bce3e2921222" (UID: "6831bf62-b6e7-420c-9f69-bce3e2921222"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:30:03 crc kubenswrapper[4956]: I0930 06:30:03.929898 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6831bf62-b6e7-420c-9f69-bce3e2921222-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6831bf62-b6e7-420c-9f69-bce3e2921222" (UID: "6831bf62-b6e7-420c-9f69-bce3e2921222"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:30:03 crc kubenswrapper[4956]: I0930 06:30:03.937470 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6831bf62-b6e7-420c-9f69-bce3e2921222-kube-api-access-49chs" (OuterVolumeSpecName: "kube-api-access-49chs") pod "6831bf62-b6e7-420c-9f69-bce3e2921222" (UID: "6831bf62-b6e7-420c-9f69-bce3e2921222"). InnerVolumeSpecName "kube-api-access-49chs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:30:04 crc kubenswrapper[4956]: I0930 06:30:04.024860 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49chs\" (UniqueName: \"kubernetes.io/projected/6831bf62-b6e7-420c-9f69-bce3e2921222-kube-api-access-49chs\") on node \"crc\" DevicePath \"\"" Sep 30 06:30:04 crc kubenswrapper[4956]: I0930 06:30:04.024937 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6831bf62-b6e7-420c-9f69-bce3e2921222-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:30:04 crc kubenswrapper[4956]: I0930 06:30:04.024966 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6831bf62-b6e7-420c-9f69-bce3e2921222-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:30:04 crc kubenswrapper[4956]: I0930 06:30:04.298025 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" event={"ID":"6831bf62-b6e7-420c-9f69-bce3e2921222","Type":"ContainerDied","Data":"7c2ca9b73dd8007f47e82366e821bc3d3e1db535a27210f97a8f03af15497ad2"} Sep 30 06:30:04 crc kubenswrapper[4956]: I0930 06:30:04.298077 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6" Sep 30 06:30:04 crc kubenswrapper[4956]: I0930 06:30:04.298102 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c2ca9b73dd8007f47e82366e821bc3d3e1db535a27210f97a8f03af15497ad2" Sep 30 06:30:04 crc kubenswrapper[4956]: I0930 06:30:04.380631 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk"] Sep 30 06:30:04 crc kubenswrapper[4956]: I0930 06:30:04.390170 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320185-xmjtk"] Sep 30 06:30:06 crc kubenswrapper[4956]: I0930 06:30:06.359943 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="017b2353-9d72-4844-a21b-9dce833ae063" path="/var/lib/kubelet/pods/017b2353-9d72-4844-a21b-9dce833ae063/volumes" Sep 30 06:30:14 crc kubenswrapper[4956]: I0930 06:30:14.942472 4956 scope.go:117] "RemoveContainer" containerID="1f5a2fd296b771e30997d3629edd94c876285fe1b5e3c9d763d3b93267c3b487" Sep 30 06:30:15 crc kubenswrapper[4956]: I0930 06:30:15.341481 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:30:15 crc kubenswrapper[4956]: E0930 06:30:15.342049 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:30:29 crc kubenswrapper[4956]: I0930 06:30:29.341454 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 
06:30:29 crc kubenswrapper[4956]: E0930 06:30:29.342768 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:30:41 crc kubenswrapper[4956]: I0930 06:30:41.341192 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:30:41 crc kubenswrapper[4956]: E0930 06:30:41.342122 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:30:56 crc kubenswrapper[4956]: I0930 06:30:56.341203 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:30:56 crc kubenswrapper[4956]: E0930 06:30:56.342205 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:31:10 crc kubenswrapper[4956]: I0930 06:31:10.349557 4956 scope.go:117] "RemoveContainer" 
containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:31:10 crc kubenswrapper[4956]: E0930 06:31:10.350473 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:31:23 crc kubenswrapper[4956]: I0930 06:31:23.340753 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:31:23 crc kubenswrapper[4956]: E0930 06:31:23.342643 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.048999 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5ljhx"] Sep 30 06:31:28 crc kubenswrapper[4956]: E0930 06:31:28.049981 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6831bf62-b6e7-420c-9f69-bce3e2921222" containerName="collect-profiles" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.049995 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="6831bf62-b6e7-420c-9f69-bce3e2921222" containerName="collect-profiles" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.050188 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="6831bf62-b6e7-420c-9f69-bce3e2921222" 
containerName="collect-profiles" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.051755 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.103242 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5ljhx"] Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.189408 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpwb5\" (UniqueName: \"kubernetes.io/projected/7a56af6d-09c5-4ed7-a411-659499c9a1ff-kube-api-access-xpwb5\") pod \"certified-operators-5ljhx\" (UID: \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\") " pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.189696 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a56af6d-09c5-4ed7-a411-659499c9a1ff-utilities\") pod \"certified-operators-5ljhx\" (UID: \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\") " pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.190457 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a56af6d-09c5-4ed7-a411-659499c9a1ff-catalog-content\") pod \"certified-operators-5ljhx\" (UID: \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\") " pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.292396 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpwb5\" (UniqueName: \"kubernetes.io/projected/7a56af6d-09c5-4ed7-a411-659499c9a1ff-kube-api-access-xpwb5\") pod \"certified-operators-5ljhx\" (UID: \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\") 
" pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.292834 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a56af6d-09c5-4ed7-a411-659499c9a1ff-utilities\") pod \"certified-operators-5ljhx\" (UID: \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\") " pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.292969 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a56af6d-09c5-4ed7-a411-659499c9a1ff-catalog-content\") pod \"certified-operators-5ljhx\" (UID: \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\") " pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.293720 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a56af6d-09c5-4ed7-a411-659499c9a1ff-utilities\") pod \"certified-operators-5ljhx\" (UID: \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\") " pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.293913 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a56af6d-09c5-4ed7-a411-659499c9a1ff-catalog-content\") pod \"certified-operators-5ljhx\" (UID: \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\") " pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.312395 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpwb5\" (UniqueName: \"kubernetes.io/projected/7a56af6d-09c5-4ed7-a411-659499c9a1ff-kube-api-access-xpwb5\") pod \"certified-operators-5ljhx\" (UID: \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\") " 
pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.403874 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:28 crc kubenswrapper[4956]: I0930 06:31:28.968474 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5ljhx"] Sep 30 06:31:29 crc kubenswrapper[4956]: I0930 06:31:29.292374 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljhx" event={"ID":"7a56af6d-09c5-4ed7-a411-659499c9a1ff","Type":"ContainerStarted","Data":"705ae3fde6cd7a148465c72ed97c9ece80af42111cac15b7f991c66346fa22ab"} Sep 30 06:31:30 crc kubenswrapper[4956]: I0930 06:31:30.307771 4956 generic.go:334] "Generic (PLEG): container finished" podID="7a56af6d-09c5-4ed7-a411-659499c9a1ff" containerID="d4c2b508d7efd990f73e22c8d24456ffaefbcf73eea8e855e36da750c066e695" exitCode=0 Sep 30 06:31:30 crc kubenswrapper[4956]: I0930 06:31:30.307913 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljhx" event={"ID":"7a56af6d-09c5-4ed7-a411-659499c9a1ff","Type":"ContainerDied","Data":"d4c2b508d7efd990f73e22c8d24456ffaefbcf73eea8e855e36da750c066e695"} Sep 30 06:31:32 crc kubenswrapper[4956]: I0930 06:31:32.341807 4956 generic.go:334] "Generic (PLEG): container finished" podID="7a56af6d-09c5-4ed7-a411-659499c9a1ff" containerID="2f45465b698e26c0e97d31d58ed2e4d4a1093a2b0bf780811a66d153ae520b03" exitCode=0 Sep 30 06:31:32 crc kubenswrapper[4956]: I0930 06:31:32.357716 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljhx" event={"ID":"7a56af6d-09c5-4ed7-a411-659499c9a1ff","Type":"ContainerDied","Data":"2f45465b698e26c0e97d31d58ed2e4d4a1093a2b0bf780811a66d153ae520b03"} Sep 30 06:31:33 crc kubenswrapper[4956]: I0930 06:31:33.356355 4956 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljhx" event={"ID":"7a56af6d-09c5-4ed7-a411-659499c9a1ff","Type":"ContainerStarted","Data":"628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8"} Sep 30 06:31:33 crc kubenswrapper[4956]: I0930 06:31:33.394778 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5ljhx" podStartSLOduration=2.9619645759999997 podStartE2EDuration="5.394758313s" podCreationTimestamp="2025-09-30 06:31:28 +0000 UTC" firstStartedPulling="2025-09-30 06:31:30.31303752 +0000 UTC m=+3760.640158055" lastFinishedPulling="2025-09-30 06:31:32.745831257 +0000 UTC m=+3763.072951792" observedRunningTime="2025-09-30 06:31:33.38698594 +0000 UTC m=+3763.714106485" watchObservedRunningTime="2025-09-30 06:31:33.394758313 +0000 UTC m=+3763.721878838" Sep 30 06:31:36 crc kubenswrapper[4956]: I0930 06:31:36.341699 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:31:36 crc kubenswrapper[4956]: E0930 06:31:36.343107 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:31:38 crc kubenswrapper[4956]: I0930 06:31:38.404614 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:38 crc kubenswrapper[4956]: I0930 06:31:38.404837 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:38 crc kubenswrapper[4956]: I0930 06:31:38.451536 4956 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:39 crc kubenswrapper[4956]: I0930 06:31:39.518172 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:39 crc kubenswrapper[4956]: I0930 06:31:39.592280 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5ljhx"] Sep 30 06:31:41 crc kubenswrapper[4956]: I0930 06:31:41.455935 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5ljhx" podUID="7a56af6d-09c5-4ed7-a411-659499c9a1ff" containerName="registry-server" containerID="cri-o://628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8" gracePeriod=2 Sep 30 06:31:41 crc kubenswrapper[4956]: I0930 06:31:41.991999 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.112844 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a56af6d-09c5-4ed7-a411-659499c9a1ff-catalog-content\") pod \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\" (UID: \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\") " Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.112980 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpwb5\" (UniqueName: \"kubernetes.io/projected/7a56af6d-09c5-4ed7-a411-659499c9a1ff-kube-api-access-xpwb5\") pod \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\" (UID: \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\") " Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.113171 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7a56af6d-09c5-4ed7-a411-659499c9a1ff-utilities\") pod \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\" (UID: \"7a56af6d-09c5-4ed7-a411-659499c9a1ff\") " Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.115027 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a56af6d-09c5-4ed7-a411-659499c9a1ff-utilities" (OuterVolumeSpecName: "utilities") pod "7a56af6d-09c5-4ed7-a411-659499c9a1ff" (UID: "7a56af6d-09c5-4ed7-a411-659499c9a1ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.126354 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a56af6d-09c5-4ed7-a411-659499c9a1ff-kube-api-access-xpwb5" (OuterVolumeSpecName: "kube-api-access-xpwb5") pod "7a56af6d-09c5-4ed7-a411-659499c9a1ff" (UID: "7a56af6d-09c5-4ed7-a411-659499c9a1ff"). InnerVolumeSpecName "kube-api-access-xpwb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.199130 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a56af6d-09c5-4ed7-a411-659499c9a1ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a56af6d-09c5-4ed7-a411-659499c9a1ff" (UID: "7a56af6d-09c5-4ed7-a411-659499c9a1ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.216098 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpwb5\" (UniqueName: \"kubernetes.io/projected/7a56af6d-09c5-4ed7-a411-659499c9a1ff-kube-api-access-xpwb5\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.216521 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a56af6d-09c5-4ed7-a411-659499c9a1ff-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.216531 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a56af6d-09c5-4ed7-a411-659499c9a1ff-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.473235 4956 generic.go:334] "Generic (PLEG): container finished" podID="7a56af6d-09c5-4ed7-a411-659499c9a1ff" containerID="628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8" exitCode=0 Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.473304 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljhx" event={"ID":"7a56af6d-09c5-4ed7-a411-659499c9a1ff","Type":"ContainerDied","Data":"628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8"} Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.473350 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5ljhx" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.473405 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljhx" event={"ID":"7a56af6d-09c5-4ed7-a411-659499c9a1ff","Type":"ContainerDied","Data":"705ae3fde6cd7a148465c72ed97c9ece80af42111cac15b7f991c66346fa22ab"} Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.473446 4956 scope.go:117] "RemoveContainer" containerID="628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.493207 4956 scope.go:117] "RemoveContainer" containerID="2f45465b698e26c0e97d31d58ed2e4d4a1093a2b0bf780811a66d153ae520b03" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.506505 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5ljhx"] Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.517614 4956 scope.go:117] "RemoveContainer" containerID="d4c2b508d7efd990f73e22c8d24456ffaefbcf73eea8e855e36da750c066e695" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.521217 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5ljhx"] Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.558243 4956 scope.go:117] "RemoveContainer" containerID="628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8" Sep 30 06:31:42 crc kubenswrapper[4956]: E0930 06:31:42.558519 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8\": container with ID starting with 628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8 not found: ID does not exist" containerID="628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.558563 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8"} err="failed to get container status \"628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8\": rpc error: code = NotFound desc = could not find container \"628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8\": container with ID starting with 628a704dd111b21459cbce40480fc047c2a622a8733ee2bec645da5980866af8 not found: ID does not exist" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.558593 4956 scope.go:117] "RemoveContainer" containerID="2f45465b698e26c0e97d31d58ed2e4d4a1093a2b0bf780811a66d153ae520b03" Sep 30 06:31:42 crc kubenswrapper[4956]: E0930 06:31:42.558842 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f45465b698e26c0e97d31d58ed2e4d4a1093a2b0bf780811a66d153ae520b03\": container with ID starting with 2f45465b698e26c0e97d31d58ed2e4d4a1093a2b0bf780811a66d153ae520b03 not found: ID does not exist" containerID="2f45465b698e26c0e97d31d58ed2e4d4a1093a2b0bf780811a66d153ae520b03" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.558880 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f45465b698e26c0e97d31d58ed2e4d4a1093a2b0bf780811a66d153ae520b03"} err="failed to get container status \"2f45465b698e26c0e97d31d58ed2e4d4a1093a2b0bf780811a66d153ae520b03\": rpc error: code = NotFound desc = could not find container \"2f45465b698e26c0e97d31d58ed2e4d4a1093a2b0bf780811a66d153ae520b03\": container with ID starting with 2f45465b698e26c0e97d31d58ed2e4d4a1093a2b0bf780811a66d153ae520b03 not found: ID does not exist" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.558938 4956 scope.go:117] "RemoveContainer" containerID="d4c2b508d7efd990f73e22c8d24456ffaefbcf73eea8e855e36da750c066e695" Sep 30 06:31:42 crc kubenswrapper[4956]: E0930 
06:31:42.559196 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4c2b508d7efd990f73e22c8d24456ffaefbcf73eea8e855e36da750c066e695\": container with ID starting with d4c2b508d7efd990f73e22c8d24456ffaefbcf73eea8e855e36da750c066e695 not found: ID does not exist" containerID="d4c2b508d7efd990f73e22c8d24456ffaefbcf73eea8e855e36da750c066e695" Sep 30 06:31:42 crc kubenswrapper[4956]: I0930 06:31:42.559222 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c2b508d7efd990f73e22c8d24456ffaefbcf73eea8e855e36da750c066e695"} err="failed to get container status \"d4c2b508d7efd990f73e22c8d24456ffaefbcf73eea8e855e36da750c066e695\": rpc error: code = NotFound desc = could not find container \"d4c2b508d7efd990f73e22c8d24456ffaefbcf73eea8e855e36da750c066e695\": container with ID starting with d4c2b508d7efd990f73e22c8d24456ffaefbcf73eea8e855e36da750c066e695 not found: ID does not exist" Sep 30 06:31:44 crc kubenswrapper[4956]: I0930 06:31:44.353197 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a56af6d-09c5-4ed7-a411-659499c9a1ff" path="/var/lib/kubelet/pods/7a56af6d-09c5-4ed7-a411-659499c9a1ff/volumes" Sep 30 06:31:47 crc kubenswrapper[4956]: I0930 06:31:47.341386 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:31:47 crc kubenswrapper[4956]: E0930 06:31:47.341913 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:31:59 crc kubenswrapper[4956]: I0930 06:31:59.343891 
4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:31:59 crc kubenswrapper[4956]: E0930 06:31:59.344686 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:32:14 crc kubenswrapper[4956]: I0930 06:32:14.341108 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:32:14 crc kubenswrapper[4956]: E0930 06:32:14.342359 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:32:28 crc kubenswrapper[4956]: I0930 06:32:28.341884 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:32:28 crc kubenswrapper[4956]: E0930 06:32:28.342652 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:32:43 crc kubenswrapper[4956]: I0930 
06:32:43.342355 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:32:43 crc kubenswrapper[4956]: E0930 06:32:43.343527 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:32:54 crc kubenswrapper[4956]: I0930 06:32:54.341964 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:32:55 crc kubenswrapper[4956]: I0930 06:32:55.257892 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"2c26bbe8aaea8835183c740bf6b7cef2e60d754476c1f108edf64dbad7c17cb7"} Sep 30 06:35:18 crc kubenswrapper[4956]: I0930 06:35:18.074640 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:35:18 crc kubenswrapper[4956]: I0930 06:35:18.075842 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:35:48 crc kubenswrapper[4956]: I0930 06:35:48.073192 4956 patch_prober.go:28] interesting 
pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:35:48 crc kubenswrapper[4956]: I0930 06:35:48.073718 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:36:18 crc kubenswrapper[4956]: I0930 06:36:18.073847 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:36:18 crc kubenswrapper[4956]: I0930 06:36:18.074674 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:36:18 crc kubenswrapper[4956]: I0930 06:36:18.074765 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 06:36:18 crc kubenswrapper[4956]: I0930 06:36:18.076204 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c26bbe8aaea8835183c740bf6b7cef2e60d754476c1f108edf64dbad7c17cb7"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Sep 30 06:36:18 crc kubenswrapper[4956]: I0930 06:36:18.076299 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://2c26bbe8aaea8835183c740bf6b7cef2e60d754476c1f108edf64dbad7c17cb7" gracePeriod=600 Sep 30 06:36:18 crc kubenswrapper[4956]: I0930 06:36:18.556813 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="2c26bbe8aaea8835183c740bf6b7cef2e60d754476c1f108edf64dbad7c17cb7" exitCode=0 Sep 30 06:36:18 crc kubenswrapper[4956]: I0930 06:36:18.556903 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"2c26bbe8aaea8835183c740bf6b7cef2e60d754476c1f108edf64dbad7c17cb7"} Sep 30 06:36:18 crc kubenswrapper[4956]: I0930 06:36:18.557279 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024"} Sep 30 06:36:18 crc kubenswrapper[4956]: I0930 06:36:18.557305 4956 scope.go:117] "RemoveContainer" containerID="826224add61d8ffe2dea2c07fe0c76bdd6cf4a6a340e166774be8eb7ec61964c" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.215491 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8vqmw"] Sep 30 06:38:01 crc kubenswrapper[4956]: E0930 06:38:01.217628 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a56af6d-09c5-4ed7-a411-659499c9a1ff" containerName="extract-utilities" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.217654 4956 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="7a56af6d-09c5-4ed7-a411-659499c9a1ff" containerName="extract-utilities" Sep 30 06:38:01 crc kubenswrapper[4956]: E0930 06:38:01.217740 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a56af6d-09c5-4ed7-a411-659499c9a1ff" containerName="extract-content" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.217747 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a56af6d-09c5-4ed7-a411-659499c9a1ff" containerName="extract-content" Sep 30 06:38:01 crc kubenswrapper[4956]: E0930 06:38:01.217765 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a56af6d-09c5-4ed7-a411-659499c9a1ff" containerName="registry-server" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.217771 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a56af6d-09c5-4ed7-a411-659499c9a1ff" containerName="registry-server" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.217985 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a56af6d-09c5-4ed7-a411-659499c9a1ff" containerName="registry-server" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.220735 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.239438 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vqmw"] Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.365734 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c497f0e-054f-4e94-94e5-aa6d51c38aba-catalog-content\") pod \"community-operators-8vqmw\" (UID: \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\") " pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.365803 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gnjm\" (UniqueName: \"kubernetes.io/projected/0c497f0e-054f-4e94-94e5-aa6d51c38aba-kube-api-access-9gnjm\") pod \"community-operators-8vqmw\" (UID: \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\") " pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.365907 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c497f0e-054f-4e94-94e5-aa6d51c38aba-utilities\") pod \"community-operators-8vqmw\" (UID: \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\") " pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.467533 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gnjm\" (UniqueName: \"kubernetes.io/projected/0c497f0e-054f-4e94-94e5-aa6d51c38aba-kube-api-access-9gnjm\") pod \"community-operators-8vqmw\" (UID: \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\") " pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.467654 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c497f0e-054f-4e94-94e5-aa6d51c38aba-utilities\") pod \"community-operators-8vqmw\" (UID: \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\") " pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.467795 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c497f0e-054f-4e94-94e5-aa6d51c38aba-catalog-content\") pod \"community-operators-8vqmw\" (UID: \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\") " pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.470008 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c497f0e-054f-4e94-94e5-aa6d51c38aba-catalog-content\") pod \"community-operators-8vqmw\" (UID: \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\") " pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.470220 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c497f0e-054f-4e94-94e5-aa6d51c38aba-utilities\") pod \"community-operators-8vqmw\" (UID: \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\") " pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.490304 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gnjm\" (UniqueName: \"kubernetes.io/projected/0c497f0e-054f-4e94-94e5-aa6d51c38aba-kube-api-access-9gnjm\") pod \"community-operators-8vqmw\" (UID: \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\") " pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:01 crc kubenswrapper[4956]: I0930 06:38:01.554592 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:02 crc kubenswrapper[4956]: I0930 06:38:02.046782 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vqmw"] Sep 30 06:38:02 crc kubenswrapper[4956]: I0930 06:38:02.914617 4956 generic.go:334] "Generic (PLEG): container finished" podID="0c497f0e-054f-4e94-94e5-aa6d51c38aba" containerID="d33f1d3322304703787f3ce0cfbdc74ed76fae9b5bcae7bc0a2eb4147ef9540e" exitCode=0 Sep 30 06:38:02 crc kubenswrapper[4956]: I0930 06:38:02.914704 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqmw" event={"ID":"0c497f0e-054f-4e94-94e5-aa6d51c38aba","Type":"ContainerDied","Data":"d33f1d3322304703787f3ce0cfbdc74ed76fae9b5bcae7bc0a2eb4147ef9540e"} Sep 30 06:38:02 crc kubenswrapper[4956]: I0930 06:38:02.914953 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqmw" event={"ID":"0c497f0e-054f-4e94-94e5-aa6d51c38aba","Type":"ContainerStarted","Data":"fc7b7fa0ecfa1834c0aa020d8f91ed540cf17a75eab82de9294fe9a7b7f04ab2"} Sep 30 06:38:02 crc kubenswrapper[4956]: I0930 06:38:02.918264 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:38:04 crc kubenswrapper[4956]: I0930 06:38:04.939769 4956 generic.go:334] "Generic (PLEG): container finished" podID="0c497f0e-054f-4e94-94e5-aa6d51c38aba" containerID="1bdb9c2dd2488977d53d4227c320a776b596a55cde3fc92bd229760eba3b354c" exitCode=0 Sep 30 06:38:04 crc kubenswrapper[4956]: I0930 06:38:04.940028 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqmw" event={"ID":"0c497f0e-054f-4e94-94e5-aa6d51c38aba","Type":"ContainerDied","Data":"1bdb9c2dd2488977d53d4227c320a776b596a55cde3fc92bd229760eba3b354c"} Sep 30 06:38:05 crc kubenswrapper[4956]: I0930 06:38:05.951203 4956 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-8vqmw" event={"ID":"0c497f0e-054f-4e94-94e5-aa6d51c38aba","Type":"ContainerStarted","Data":"2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79"} Sep 30 06:38:05 crc kubenswrapper[4956]: I0930 06:38:05.974356 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8vqmw" podStartSLOduration=2.457095496 podStartE2EDuration="4.974328711s" podCreationTimestamp="2025-09-30 06:38:01 +0000 UTC" firstStartedPulling="2025-09-30 06:38:02.917738955 +0000 UTC m=+4153.244859520" lastFinishedPulling="2025-09-30 06:38:05.43497217 +0000 UTC m=+4155.762092735" observedRunningTime="2025-09-30 06:38:05.968368734 +0000 UTC m=+4156.295489279" watchObservedRunningTime="2025-09-30 06:38:05.974328711 +0000 UTC m=+4156.301449276" Sep 30 06:38:11 crc kubenswrapper[4956]: I0930 06:38:11.555035 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:11 crc kubenswrapper[4956]: I0930 06:38:11.555859 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:11 crc kubenswrapper[4956]: I0930 06:38:11.719495 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:12 crc kubenswrapper[4956]: I0930 06:38:12.055339 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:12 crc kubenswrapper[4956]: I0930 06:38:12.108912 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vqmw"] Sep 30 06:38:14 crc kubenswrapper[4956]: I0930 06:38:14.035979 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8vqmw" 
podUID="0c497f0e-054f-4e94-94e5-aa6d51c38aba" containerName="registry-server" containerID="cri-o://2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79" gracePeriod=2 Sep 30 06:38:14 crc kubenswrapper[4956]: I0930 06:38:14.624085 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:14 crc kubenswrapper[4956]: I0930 06:38:14.689934 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gnjm\" (UniqueName: \"kubernetes.io/projected/0c497f0e-054f-4e94-94e5-aa6d51c38aba-kube-api-access-9gnjm\") pod \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\" (UID: \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\") " Sep 30 06:38:14 crc kubenswrapper[4956]: I0930 06:38:14.690016 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c497f0e-054f-4e94-94e5-aa6d51c38aba-utilities\") pod \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\" (UID: \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\") " Sep 30 06:38:14 crc kubenswrapper[4956]: I0930 06:38:14.690090 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c497f0e-054f-4e94-94e5-aa6d51c38aba-catalog-content\") pod \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\" (UID: \"0c497f0e-054f-4e94-94e5-aa6d51c38aba\") " Sep 30 06:38:14 crc kubenswrapper[4956]: I0930 06:38:14.691558 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c497f0e-054f-4e94-94e5-aa6d51c38aba-utilities" (OuterVolumeSpecName: "utilities") pod "0c497f0e-054f-4e94-94e5-aa6d51c38aba" (UID: "0c497f0e-054f-4e94-94e5-aa6d51c38aba"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:38:14 crc kubenswrapper[4956]: I0930 06:38:14.699715 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c497f0e-054f-4e94-94e5-aa6d51c38aba-kube-api-access-9gnjm" (OuterVolumeSpecName: "kube-api-access-9gnjm") pod "0c497f0e-054f-4e94-94e5-aa6d51c38aba" (UID: "0c497f0e-054f-4e94-94e5-aa6d51c38aba"). InnerVolumeSpecName "kube-api-access-9gnjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:38:14 crc kubenswrapper[4956]: I0930 06:38:14.764994 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c497f0e-054f-4e94-94e5-aa6d51c38aba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c497f0e-054f-4e94-94e5-aa6d51c38aba" (UID: "0c497f0e-054f-4e94-94e5-aa6d51c38aba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:38:14 crc kubenswrapper[4956]: I0930 06:38:14.795527 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gnjm\" (UniqueName: \"kubernetes.io/projected/0c497f0e-054f-4e94-94e5-aa6d51c38aba-kube-api-access-9gnjm\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:14 crc kubenswrapper[4956]: I0930 06:38:14.795581 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c497f0e-054f-4e94-94e5-aa6d51c38aba-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:14 crc kubenswrapper[4956]: I0930 06:38:14.795601 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c497f0e-054f-4e94-94e5-aa6d51c38aba-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.049096 4956 generic.go:334] "Generic (PLEG): container finished" podID="0c497f0e-054f-4e94-94e5-aa6d51c38aba" 
containerID="2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79" exitCode=0 Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.049174 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vqmw" Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.049179 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqmw" event={"ID":"0c497f0e-054f-4e94-94e5-aa6d51c38aba","Type":"ContainerDied","Data":"2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79"} Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.049269 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqmw" event={"ID":"0c497f0e-054f-4e94-94e5-aa6d51c38aba","Type":"ContainerDied","Data":"fc7b7fa0ecfa1834c0aa020d8f91ed540cf17a75eab82de9294fe9a7b7f04ab2"} Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.049299 4956 scope.go:117] "RemoveContainer" containerID="2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79" Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.097345 4956 scope.go:117] "RemoveContainer" containerID="1bdb9c2dd2488977d53d4227c320a776b596a55cde3fc92bd229760eba3b354c" Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.103209 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vqmw"] Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.119445 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8vqmw"] Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.139296 4956 scope.go:117] "RemoveContainer" containerID="d33f1d3322304703787f3ce0cfbdc74ed76fae9b5bcae7bc0a2eb4147ef9540e" Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.194223 4956 scope.go:117] "RemoveContainer" containerID="2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79" Sep 30 
06:38:15 crc kubenswrapper[4956]: E0930 06:38:15.194746 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79\": container with ID starting with 2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79 not found: ID does not exist" containerID="2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79" Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.194798 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79"} err="failed to get container status \"2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79\": rpc error: code = NotFound desc = could not find container \"2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79\": container with ID starting with 2276ab46fc9bdd8e5321d9c7f00380cb171e0174edf430229fee2fb5f17a9e79 not found: ID does not exist" Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.194839 4956 scope.go:117] "RemoveContainer" containerID="1bdb9c2dd2488977d53d4227c320a776b596a55cde3fc92bd229760eba3b354c" Sep 30 06:38:15 crc kubenswrapper[4956]: E0930 06:38:15.195144 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bdb9c2dd2488977d53d4227c320a776b596a55cde3fc92bd229760eba3b354c\": container with ID starting with 1bdb9c2dd2488977d53d4227c320a776b596a55cde3fc92bd229760eba3b354c not found: ID does not exist" containerID="1bdb9c2dd2488977d53d4227c320a776b596a55cde3fc92bd229760eba3b354c" Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.195180 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdb9c2dd2488977d53d4227c320a776b596a55cde3fc92bd229760eba3b354c"} err="failed to get container status 
\"1bdb9c2dd2488977d53d4227c320a776b596a55cde3fc92bd229760eba3b354c\": rpc error: code = NotFound desc = could not find container \"1bdb9c2dd2488977d53d4227c320a776b596a55cde3fc92bd229760eba3b354c\": container with ID starting with 1bdb9c2dd2488977d53d4227c320a776b596a55cde3fc92bd229760eba3b354c not found: ID does not exist" Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.195204 4956 scope.go:117] "RemoveContainer" containerID="d33f1d3322304703787f3ce0cfbdc74ed76fae9b5bcae7bc0a2eb4147ef9540e" Sep 30 06:38:15 crc kubenswrapper[4956]: E0930 06:38:15.195452 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d33f1d3322304703787f3ce0cfbdc74ed76fae9b5bcae7bc0a2eb4147ef9540e\": container with ID starting with d33f1d3322304703787f3ce0cfbdc74ed76fae9b5bcae7bc0a2eb4147ef9540e not found: ID does not exist" containerID="d33f1d3322304703787f3ce0cfbdc74ed76fae9b5bcae7bc0a2eb4147ef9540e" Sep 30 06:38:15 crc kubenswrapper[4956]: I0930 06:38:15.195481 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d33f1d3322304703787f3ce0cfbdc74ed76fae9b5bcae7bc0a2eb4147ef9540e"} err="failed to get container status \"d33f1d3322304703787f3ce0cfbdc74ed76fae9b5bcae7bc0a2eb4147ef9540e\": rpc error: code = NotFound desc = could not find container \"d33f1d3322304703787f3ce0cfbdc74ed76fae9b5bcae7bc0a2eb4147ef9540e\": container with ID starting with d33f1d3322304703787f3ce0cfbdc74ed76fae9b5bcae7bc0a2eb4147ef9540e not found: ID does not exist" Sep 30 06:38:16 crc kubenswrapper[4956]: I0930 06:38:16.361064 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c497f0e-054f-4e94-94e5-aa6d51c38aba" path="/var/lib/kubelet/pods/0c497f0e-054f-4e94-94e5-aa6d51c38aba/volumes" Sep 30 06:38:18 crc kubenswrapper[4956]: I0930 06:38:18.074000 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:38:18 crc kubenswrapper[4956]: I0930 06:38:18.074734 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:38:48 crc kubenswrapper[4956]: I0930 06:38:48.074386 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:38:48 crc kubenswrapper[4956]: I0930 06:38:48.075391 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:39:16 crc kubenswrapper[4956]: I0930 06:39:16.834043 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hcbcj"] Sep 30 06:39:16 crc kubenswrapper[4956]: E0930 06:39:16.836425 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c497f0e-054f-4e94-94e5-aa6d51c38aba" containerName="extract-content" Sep 30 06:39:16 crc kubenswrapper[4956]: I0930 06:39:16.836513 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c497f0e-054f-4e94-94e5-aa6d51c38aba" containerName="extract-content" Sep 30 06:39:16 crc kubenswrapper[4956]: E0930 06:39:16.836582 4956 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0c497f0e-054f-4e94-94e5-aa6d51c38aba" containerName="extract-utilities" Sep 30 06:39:16 crc kubenswrapper[4956]: I0930 06:39:16.836636 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c497f0e-054f-4e94-94e5-aa6d51c38aba" containerName="extract-utilities" Sep 30 06:39:16 crc kubenswrapper[4956]: E0930 06:39:16.836700 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c497f0e-054f-4e94-94e5-aa6d51c38aba" containerName="registry-server" Sep 30 06:39:16 crc kubenswrapper[4956]: I0930 06:39:16.836753 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c497f0e-054f-4e94-94e5-aa6d51c38aba" containerName="registry-server" Sep 30 06:39:16 crc kubenswrapper[4956]: I0930 06:39:16.837071 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c497f0e-054f-4e94-94e5-aa6d51c38aba" containerName="registry-server" Sep 30 06:39:16 crc kubenswrapper[4956]: I0930 06:39:16.838801 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:16 crc kubenswrapper[4956]: I0930 06:39:16.853376 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hcbcj"] Sep 30 06:39:16 crc kubenswrapper[4956]: I0930 06:39:16.929082 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd6dt\" (UniqueName: \"kubernetes.io/projected/09129d4b-a35b-40e4-ab03-b72fd1908c9b-kube-api-access-pd6dt\") pod \"redhat-operators-hcbcj\" (UID: \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\") " pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:16 crc kubenswrapper[4956]: I0930 06:39:16.929618 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09129d4b-a35b-40e4-ab03-b72fd1908c9b-utilities\") pod \"redhat-operators-hcbcj\" (UID: \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\") " pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:16 crc kubenswrapper[4956]: I0930 06:39:16.929811 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09129d4b-a35b-40e4-ab03-b72fd1908c9b-catalog-content\") pod \"redhat-operators-hcbcj\" (UID: \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\") " pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:17 crc kubenswrapper[4956]: I0930 06:39:17.032647 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09129d4b-a35b-40e4-ab03-b72fd1908c9b-utilities\") pod \"redhat-operators-hcbcj\" (UID: \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\") " pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:17 crc kubenswrapper[4956]: I0930 06:39:17.032909 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09129d4b-a35b-40e4-ab03-b72fd1908c9b-catalog-content\") pod \"redhat-operators-hcbcj\" (UID: \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\") " pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:17 crc kubenswrapper[4956]: I0930 06:39:17.033034 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd6dt\" (UniqueName: \"kubernetes.io/projected/09129d4b-a35b-40e4-ab03-b72fd1908c9b-kube-api-access-pd6dt\") pod \"redhat-operators-hcbcj\" (UID: \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\") " pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:17 crc kubenswrapper[4956]: I0930 06:39:17.033680 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09129d4b-a35b-40e4-ab03-b72fd1908c9b-utilities\") pod \"redhat-operators-hcbcj\" (UID: \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\") " pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:17 crc kubenswrapper[4956]: I0930 06:39:17.033714 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09129d4b-a35b-40e4-ab03-b72fd1908c9b-catalog-content\") pod \"redhat-operators-hcbcj\" (UID: \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\") " pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:17 crc kubenswrapper[4956]: I0930 06:39:17.053093 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd6dt\" (UniqueName: \"kubernetes.io/projected/09129d4b-a35b-40e4-ab03-b72fd1908c9b-kube-api-access-pd6dt\") pod \"redhat-operators-hcbcj\" (UID: \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\") " pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:17 crc kubenswrapper[4956]: I0930 06:39:17.175565 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:17 crc kubenswrapper[4956]: I0930 06:39:17.633283 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hcbcj"] Sep 30 06:39:17 crc kubenswrapper[4956]: I0930 06:39:17.988213 4956 generic.go:334] "Generic (PLEG): container finished" podID="09129d4b-a35b-40e4-ab03-b72fd1908c9b" containerID="d3835fb40015c725f9e803c16062a6d47a2e6a5d3765028f6c404a0e210f89ac" exitCode=0 Sep 30 06:39:17 crc kubenswrapper[4956]: I0930 06:39:17.988272 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcbcj" event={"ID":"09129d4b-a35b-40e4-ab03-b72fd1908c9b","Type":"ContainerDied","Data":"d3835fb40015c725f9e803c16062a6d47a2e6a5d3765028f6c404a0e210f89ac"} Sep 30 06:39:17 crc kubenswrapper[4956]: I0930 06:39:17.988324 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcbcj" event={"ID":"09129d4b-a35b-40e4-ab03-b72fd1908c9b","Type":"ContainerStarted","Data":"a49daf39581495783ea583f2fb94eb4198a00fb7f8c382eb751b40d806ec70e7"} Sep 30 06:39:18 crc kubenswrapper[4956]: I0930 06:39:18.073648 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:39:18 crc kubenswrapper[4956]: I0930 06:39:18.073698 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:39:18 crc kubenswrapper[4956]: I0930 06:39:18.073736 4956 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 06:39:18 crc kubenswrapper[4956]: I0930 06:39:18.074381 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:39:18 crc kubenswrapper[4956]: I0930 06:39:18.074432 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" gracePeriod=600 Sep 30 06:39:18 crc kubenswrapper[4956]: E0930 06:39:18.210890 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:39:19 crc kubenswrapper[4956]: I0930 06:39:18.999790 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" exitCode=0 Sep 30 06:39:19 crc kubenswrapper[4956]: I0930 06:39:18.999830 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" 
event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024"} Sep 30 06:39:19 crc kubenswrapper[4956]: I0930 06:39:19.000239 4956 scope.go:117] "RemoveContainer" containerID="2c26bbe8aaea8835183c740bf6b7cef2e60d754476c1f108edf64dbad7c17cb7" Sep 30 06:39:19 crc kubenswrapper[4956]: I0930 06:39:19.000931 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:39:19 crc kubenswrapper[4956]: E0930 06:39:19.001545 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:39:19 crc kubenswrapper[4956]: I0930 06:39:19.003585 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcbcj" event={"ID":"09129d4b-a35b-40e4-ab03-b72fd1908c9b","Type":"ContainerStarted","Data":"93fadaec8367de965046de12b0d6198364e8c3d5d9762ec6a07d6de56cf25fb1"} Sep 30 06:39:21 crc kubenswrapper[4956]: I0930 06:39:21.032938 4956 generic.go:334] "Generic (PLEG): container finished" podID="09129d4b-a35b-40e4-ab03-b72fd1908c9b" containerID="93fadaec8367de965046de12b0d6198364e8c3d5d9762ec6a07d6de56cf25fb1" exitCode=0 Sep 30 06:39:21 crc kubenswrapper[4956]: I0930 06:39:21.033077 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcbcj" event={"ID":"09129d4b-a35b-40e4-ab03-b72fd1908c9b","Type":"ContainerDied","Data":"93fadaec8367de965046de12b0d6198364e8c3d5d9762ec6a07d6de56cf25fb1"} Sep 30 06:39:23 crc kubenswrapper[4956]: I0930 06:39:23.063472 4956 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-hcbcj" event={"ID":"09129d4b-a35b-40e4-ab03-b72fd1908c9b","Type":"ContainerStarted","Data":"16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774"} Sep 30 06:39:23 crc kubenswrapper[4956]: I0930 06:39:23.092241 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hcbcj" podStartSLOduration=3.5289031189999998 podStartE2EDuration="7.092216513s" podCreationTimestamp="2025-09-30 06:39:16 +0000 UTC" firstStartedPulling="2025-09-30 06:39:17.990107397 +0000 UTC m=+4228.317227922" lastFinishedPulling="2025-09-30 06:39:21.553420781 +0000 UTC m=+4231.880541316" observedRunningTime="2025-09-30 06:39:23.091325035 +0000 UTC m=+4233.418445580" watchObservedRunningTime="2025-09-30 06:39:23.092216513 +0000 UTC m=+4233.419337048" Sep 30 06:39:27 crc kubenswrapper[4956]: I0930 06:39:27.176316 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:27 crc kubenswrapper[4956]: I0930 06:39:27.176925 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:27 crc kubenswrapper[4956]: I0930 06:39:27.248354 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:28 crc kubenswrapper[4956]: I0930 06:39:28.183462 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:28 crc kubenswrapper[4956]: I0930 06:39:28.256875 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hcbcj"] Sep 30 06:39:30 crc kubenswrapper[4956]: I0930 06:39:30.135089 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hcbcj" 
podUID="09129d4b-a35b-40e4-ab03-b72fd1908c9b" containerName="registry-server" containerID="cri-o://16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774" gracePeriod=2 Sep 30 06:39:30 crc kubenswrapper[4956]: I0930 06:39:30.661295 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:30 crc kubenswrapper[4956]: I0930 06:39:30.760512 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09129d4b-a35b-40e4-ab03-b72fd1908c9b-catalog-content\") pod \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\" (UID: \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\") " Sep 30 06:39:30 crc kubenswrapper[4956]: I0930 06:39:30.761477 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09129d4b-a35b-40e4-ab03-b72fd1908c9b-utilities\") pod \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\" (UID: \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\") " Sep 30 06:39:30 crc kubenswrapper[4956]: I0930 06:39:30.761575 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd6dt\" (UniqueName: \"kubernetes.io/projected/09129d4b-a35b-40e4-ab03-b72fd1908c9b-kube-api-access-pd6dt\") pod \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\" (UID: \"09129d4b-a35b-40e4-ab03-b72fd1908c9b\") " Sep 30 06:39:30 crc kubenswrapper[4956]: I0930 06:39:30.762243 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09129d4b-a35b-40e4-ab03-b72fd1908c9b-utilities" (OuterVolumeSpecName: "utilities") pod "09129d4b-a35b-40e4-ab03-b72fd1908c9b" (UID: "09129d4b-a35b-40e4-ab03-b72fd1908c9b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:39:30 crc kubenswrapper[4956]: I0930 06:39:30.762591 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09129d4b-a35b-40e4-ab03-b72fd1908c9b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:30 crc kubenswrapper[4956]: I0930 06:39:30.778732 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09129d4b-a35b-40e4-ab03-b72fd1908c9b-kube-api-access-pd6dt" (OuterVolumeSpecName: "kube-api-access-pd6dt") pod "09129d4b-a35b-40e4-ab03-b72fd1908c9b" (UID: "09129d4b-a35b-40e4-ab03-b72fd1908c9b"). InnerVolumeSpecName "kube-api-access-pd6dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:39:30 crc kubenswrapper[4956]: I0930 06:39:30.866087 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd6dt\" (UniqueName: \"kubernetes.io/projected/09129d4b-a35b-40e4-ab03-b72fd1908c9b-kube-api-access-pd6dt\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:30 crc kubenswrapper[4956]: I0930 06:39:30.888267 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09129d4b-a35b-40e4-ab03-b72fd1908c9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09129d4b-a35b-40e4-ab03-b72fd1908c9b" (UID: "09129d4b-a35b-40e4-ab03-b72fd1908c9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:39:30 crc kubenswrapper[4956]: I0930 06:39:30.969885 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09129d4b-a35b-40e4-ab03-b72fd1908c9b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.151878 4956 generic.go:334] "Generic (PLEG): container finished" podID="09129d4b-a35b-40e4-ab03-b72fd1908c9b" containerID="16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774" exitCode=0 Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.151926 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcbcj" event={"ID":"09129d4b-a35b-40e4-ab03-b72fd1908c9b","Type":"ContainerDied","Data":"16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774"} Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.151958 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hcbcj" event={"ID":"09129d4b-a35b-40e4-ab03-b72fd1908c9b","Type":"ContainerDied","Data":"a49daf39581495783ea583f2fb94eb4198a00fb7f8c382eb751b40d806ec70e7"} Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.151977 4956 scope.go:117] "RemoveContainer" containerID="16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774" Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.152082 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hcbcj" Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.173001 4956 scope.go:117] "RemoveContainer" containerID="93fadaec8367de965046de12b0d6198364e8c3d5d9762ec6a07d6de56cf25fb1" Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.195680 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hcbcj"] Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.205001 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hcbcj"] Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.214273 4956 scope.go:117] "RemoveContainer" containerID="d3835fb40015c725f9e803c16062a6d47a2e6a5d3765028f6c404a0e210f89ac" Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.266045 4956 scope.go:117] "RemoveContainer" containerID="16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774" Sep 30 06:39:31 crc kubenswrapper[4956]: E0930 06:39:31.266705 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774\": container with ID starting with 16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774 not found: ID does not exist" containerID="16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774" Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.266748 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774"} err="failed to get container status \"16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774\": rpc error: code = NotFound desc = could not find container \"16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774\": container with ID starting with 16bdafba4f734e4e2dde34cd96fa8f5a5d94fcd909fefabb10f50a239b466774 not found: ID does 
not exist" Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.266774 4956 scope.go:117] "RemoveContainer" containerID="93fadaec8367de965046de12b0d6198364e8c3d5d9762ec6a07d6de56cf25fb1" Sep 30 06:39:31 crc kubenswrapper[4956]: E0930 06:39:31.267328 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93fadaec8367de965046de12b0d6198364e8c3d5d9762ec6a07d6de56cf25fb1\": container with ID starting with 93fadaec8367de965046de12b0d6198364e8c3d5d9762ec6a07d6de56cf25fb1 not found: ID does not exist" containerID="93fadaec8367de965046de12b0d6198364e8c3d5d9762ec6a07d6de56cf25fb1" Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.267368 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93fadaec8367de965046de12b0d6198364e8c3d5d9762ec6a07d6de56cf25fb1"} err="failed to get container status \"93fadaec8367de965046de12b0d6198364e8c3d5d9762ec6a07d6de56cf25fb1\": rpc error: code = NotFound desc = could not find container \"93fadaec8367de965046de12b0d6198364e8c3d5d9762ec6a07d6de56cf25fb1\": container with ID starting with 93fadaec8367de965046de12b0d6198364e8c3d5d9762ec6a07d6de56cf25fb1 not found: ID does not exist" Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.267403 4956 scope.go:117] "RemoveContainer" containerID="d3835fb40015c725f9e803c16062a6d47a2e6a5d3765028f6c404a0e210f89ac" Sep 30 06:39:31 crc kubenswrapper[4956]: E0930 06:39:31.267701 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3835fb40015c725f9e803c16062a6d47a2e6a5d3765028f6c404a0e210f89ac\": container with ID starting with d3835fb40015c725f9e803c16062a6d47a2e6a5d3765028f6c404a0e210f89ac not found: ID does not exist" containerID="d3835fb40015c725f9e803c16062a6d47a2e6a5d3765028f6c404a0e210f89ac" Sep 30 06:39:31 crc kubenswrapper[4956]: I0930 06:39:31.267727 4956 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3835fb40015c725f9e803c16062a6d47a2e6a5d3765028f6c404a0e210f89ac"} err="failed to get container status \"d3835fb40015c725f9e803c16062a6d47a2e6a5d3765028f6c404a0e210f89ac\": rpc error: code = NotFound desc = could not find container \"d3835fb40015c725f9e803c16062a6d47a2e6a5d3765028f6c404a0e210f89ac\": container with ID starting with d3835fb40015c725f9e803c16062a6d47a2e6a5d3765028f6c404a0e210f89ac not found: ID does not exist" Sep 30 06:39:32 crc kubenswrapper[4956]: I0930 06:39:32.340806 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:39:32 crc kubenswrapper[4956]: E0930 06:39:32.341428 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:39:32 crc kubenswrapper[4956]: I0930 06:39:32.355372 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09129d4b-a35b-40e4-ab03-b72fd1908c9b" path="/var/lib/kubelet/pods/09129d4b-a35b-40e4-ab03-b72fd1908c9b/volumes" Sep 30 06:39:45 crc kubenswrapper[4956]: I0930 06:39:45.342419 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:39:45 crc kubenswrapper[4956]: E0930 06:39:45.343610 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:39:57 crc kubenswrapper[4956]: I0930 06:39:57.343327 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:39:57 crc kubenswrapper[4956]: E0930 06:39:57.343938 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:40:09 crc kubenswrapper[4956]: I0930 06:40:09.341052 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:40:09 crc kubenswrapper[4956]: E0930 06:40:09.341971 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:40:23 crc kubenswrapper[4956]: I0930 06:40:23.342368 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:40:23 crc kubenswrapper[4956]: E0930 06:40:23.343657 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:40:38 crc kubenswrapper[4956]: I0930 06:40:38.341220 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:40:38 crc kubenswrapper[4956]: E0930 06:40:38.341921 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:40:51 crc kubenswrapper[4956]: I0930 06:40:51.340918 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:40:51 crc kubenswrapper[4956]: E0930 06:40:51.341592 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:41:03 crc kubenswrapper[4956]: I0930 06:41:03.342105 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:41:03 crc kubenswrapper[4956]: E0930 06:41:03.342920 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:41:14 crc kubenswrapper[4956]: I0930 06:41:14.341086 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:41:14 crc kubenswrapper[4956]: E0930 06:41:14.342018 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:41:25 crc kubenswrapper[4956]: I0930 06:41:25.341518 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:41:25 crc kubenswrapper[4956]: E0930 06:41:25.342565 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:41:35 crc kubenswrapper[4956]: E0930 06:41:35.094054 4956 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.82:55476->38.102.83.82:43469: write tcp 38.102.83.82:55476->38.102.83.82:43469: write: broken pipe Sep 30 06:41:36 crc kubenswrapper[4956]: I0930 06:41:36.342892 4956 scope.go:117] "RemoveContainer" 
containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:41:36 crc kubenswrapper[4956]: E0930 06:41:36.343805 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.141838 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cfbss"] Sep 30 06:41:39 crc kubenswrapper[4956]: E0930 06:41:39.142902 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09129d4b-a35b-40e4-ab03-b72fd1908c9b" containerName="registry-server" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.142915 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="09129d4b-a35b-40e4-ab03-b72fd1908c9b" containerName="registry-server" Sep 30 06:41:39 crc kubenswrapper[4956]: E0930 06:41:39.142944 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09129d4b-a35b-40e4-ab03-b72fd1908c9b" containerName="extract-content" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.142950 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="09129d4b-a35b-40e4-ab03-b72fd1908c9b" containerName="extract-content" Sep 30 06:41:39 crc kubenswrapper[4956]: E0930 06:41:39.142989 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09129d4b-a35b-40e4-ab03-b72fd1908c9b" containerName="extract-utilities" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.142996 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="09129d4b-a35b-40e4-ab03-b72fd1908c9b" containerName="extract-utilities" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.143193 
4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="09129d4b-a35b-40e4-ab03-b72fd1908c9b" containerName="registry-server" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.144643 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.166727 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cfbss"] Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.271805 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a61640-95db-40a9-90e1-f0c6261e998d-catalog-content\") pod \"certified-operators-cfbss\" (UID: \"a6a61640-95db-40a9-90e1-f0c6261e998d\") " pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.271864 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a61640-95db-40a9-90e1-f0c6261e998d-utilities\") pod \"certified-operators-cfbss\" (UID: \"a6a61640-95db-40a9-90e1-f0c6261e998d\") " pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.271938 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c62fb\" (UniqueName: \"kubernetes.io/projected/a6a61640-95db-40a9-90e1-f0c6261e998d-kube-api-access-c62fb\") pod \"certified-operators-cfbss\" (UID: \"a6a61640-95db-40a9-90e1-f0c6261e998d\") " pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.375036 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a6a61640-95db-40a9-90e1-f0c6261e998d-catalog-content\") pod \"certified-operators-cfbss\" (UID: \"a6a61640-95db-40a9-90e1-f0c6261e998d\") " pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.375084 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a61640-95db-40a9-90e1-f0c6261e998d-utilities\") pod \"certified-operators-cfbss\" (UID: \"a6a61640-95db-40a9-90e1-f0c6261e998d\") " pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.375128 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c62fb\" (UniqueName: \"kubernetes.io/projected/a6a61640-95db-40a9-90e1-f0c6261e998d-kube-api-access-c62fb\") pod \"certified-operators-cfbss\" (UID: \"a6a61640-95db-40a9-90e1-f0c6261e998d\") " pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.375829 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a61640-95db-40a9-90e1-f0c6261e998d-catalog-content\") pod \"certified-operators-cfbss\" (UID: \"a6a61640-95db-40a9-90e1-f0c6261e998d\") " pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.375905 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a61640-95db-40a9-90e1-f0c6261e998d-utilities\") pod \"certified-operators-cfbss\" (UID: \"a6a61640-95db-40a9-90e1-f0c6261e998d\") " pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.395325 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c62fb\" (UniqueName: 
\"kubernetes.io/projected/a6a61640-95db-40a9-90e1-f0c6261e998d-kube-api-access-c62fb\") pod \"certified-operators-cfbss\" (UID: \"a6a61640-95db-40a9-90e1-f0c6261e998d\") " pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:39 crc kubenswrapper[4956]: I0930 06:41:39.509173 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:40 crc kubenswrapper[4956]: I0930 06:41:40.019467 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cfbss"] Sep 30 06:41:40 crc kubenswrapper[4956]: I0930 06:41:40.619408 4956 generic.go:334] "Generic (PLEG): container finished" podID="a6a61640-95db-40a9-90e1-f0c6261e998d" containerID="bb22ea4a9bd8b10c4c6d957845a0e18695f16c823cf178aa359774bd72f0f1ff" exitCode=0 Sep 30 06:41:40 crc kubenswrapper[4956]: I0930 06:41:40.619647 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfbss" event={"ID":"a6a61640-95db-40a9-90e1-f0c6261e998d","Type":"ContainerDied","Data":"bb22ea4a9bd8b10c4c6d957845a0e18695f16c823cf178aa359774bd72f0f1ff"} Sep 30 06:41:40 crc kubenswrapper[4956]: I0930 06:41:40.619917 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfbss" event={"ID":"a6a61640-95db-40a9-90e1-f0c6261e998d","Type":"ContainerStarted","Data":"27bea69320586e087f5117159c354c89ea0a151ddbad4f4cc29bd990e34cb5f8"} Sep 30 06:41:41 crc kubenswrapper[4956]: I0930 06:41:41.629855 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfbss" event={"ID":"a6a61640-95db-40a9-90e1-f0c6261e998d","Type":"ContainerStarted","Data":"f1cf773127add31040981f7bbf4dd717c3d1d1eaa7e3bd52f5ee6babc389b279"} Sep 30 06:41:42 crc kubenswrapper[4956]: I0930 06:41:42.642997 4956 generic.go:334] "Generic (PLEG): container finished" podID="a6a61640-95db-40a9-90e1-f0c6261e998d" 
containerID="f1cf773127add31040981f7bbf4dd717c3d1d1eaa7e3bd52f5ee6babc389b279" exitCode=0 Sep 30 06:41:42 crc kubenswrapper[4956]: I0930 06:41:42.643056 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfbss" event={"ID":"a6a61640-95db-40a9-90e1-f0c6261e998d","Type":"ContainerDied","Data":"f1cf773127add31040981f7bbf4dd717c3d1d1eaa7e3bd52f5ee6babc389b279"} Sep 30 06:41:43 crc kubenswrapper[4956]: I0930 06:41:43.655374 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfbss" event={"ID":"a6a61640-95db-40a9-90e1-f0c6261e998d","Type":"ContainerStarted","Data":"2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca"} Sep 30 06:41:43 crc kubenswrapper[4956]: I0930 06:41:43.672249 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cfbss" podStartSLOduration=2.187210395 podStartE2EDuration="4.672227549s" podCreationTimestamp="2025-09-30 06:41:39 +0000 UTC" firstStartedPulling="2025-09-30 06:41:40.622571692 +0000 UTC m=+4370.949692257" lastFinishedPulling="2025-09-30 06:41:43.107588886 +0000 UTC m=+4373.434709411" observedRunningTime="2025-09-30 06:41:43.670077522 +0000 UTC m=+4373.997198067" watchObservedRunningTime="2025-09-30 06:41:43.672227549 +0000 UTC m=+4373.999348114" Sep 30 06:41:49 crc kubenswrapper[4956]: I0930 06:41:49.509381 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:49 crc kubenswrapper[4956]: I0930 06:41:49.512742 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:49 crc kubenswrapper[4956]: I0930 06:41:49.591570 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:49 crc kubenswrapper[4956]: I0930 
06:41:49.786292 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:49 crc kubenswrapper[4956]: I0930 06:41:49.852779 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cfbss"] Sep 30 06:41:50 crc kubenswrapper[4956]: I0930 06:41:50.361658 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:41:50 crc kubenswrapper[4956]: E0930 06:41:50.362294 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:41:51 crc kubenswrapper[4956]: I0930 06:41:51.750864 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cfbss" podUID="a6a61640-95db-40a9-90e1-f0c6261e998d" containerName="registry-server" containerID="cri-o://2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca" gracePeriod=2 Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.301849 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.392978 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a61640-95db-40a9-90e1-f0c6261e998d-catalog-content\") pod \"a6a61640-95db-40a9-90e1-f0c6261e998d\" (UID: \"a6a61640-95db-40a9-90e1-f0c6261e998d\") " Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.393232 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c62fb\" (UniqueName: \"kubernetes.io/projected/a6a61640-95db-40a9-90e1-f0c6261e998d-kube-api-access-c62fb\") pod \"a6a61640-95db-40a9-90e1-f0c6261e998d\" (UID: \"a6a61640-95db-40a9-90e1-f0c6261e998d\") " Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.393275 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a61640-95db-40a9-90e1-f0c6261e998d-utilities\") pod \"a6a61640-95db-40a9-90e1-f0c6261e998d\" (UID: \"a6a61640-95db-40a9-90e1-f0c6261e998d\") " Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.394135 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6a61640-95db-40a9-90e1-f0c6261e998d-utilities" (OuterVolumeSpecName: "utilities") pod "a6a61640-95db-40a9-90e1-f0c6261e998d" (UID: "a6a61640-95db-40a9-90e1-f0c6261e998d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.408368 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a61640-95db-40a9-90e1-f0c6261e998d-kube-api-access-c62fb" (OuterVolumeSpecName: "kube-api-access-c62fb") pod "a6a61640-95db-40a9-90e1-f0c6261e998d" (UID: "a6a61640-95db-40a9-90e1-f0c6261e998d"). InnerVolumeSpecName "kube-api-access-c62fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.439658 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6a61640-95db-40a9-90e1-f0c6261e998d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6a61640-95db-40a9-90e1-f0c6261e998d" (UID: "a6a61640-95db-40a9-90e1-f0c6261e998d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.496588 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6a61640-95db-40a9-90e1-f0c6261e998d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.496626 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c62fb\" (UniqueName: \"kubernetes.io/projected/a6a61640-95db-40a9-90e1-f0c6261e998d-kube-api-access-c62fb\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.496642 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6a61640-95db-40a9-90e1-f0c6261e998d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.765469 4956 generic.go:334] "Generic (PLEG): container finished" podID="a6a61640-95db-40a9-90e1-f0c6261e998d" containerID="2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca" exitCode=0 Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.765524 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cfbss" event={"ID":"a6a61640-95db-40a9-90e1-f0c6261e998d","Type":"ContainerDied","Data":"2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca"} Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.765562 4956 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-cfbss" event={"ID":"a6a61640-95db-40a9-90e1-f0c6261e998d","Type":"ContainerDied","Data":"27bea69320586e087f5117159c354c89ea0a151ddbad4f4cc29bd990e34cb5f8"} Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.765602 4956 scope.go:117] "RemoveContainer" containerID="2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.765629 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cfbss" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.796405 4956 scope.go:117] "RemoveContainer" containerID="f1cf773127add31040981f7bbf4dd717c3d1d1eaa7e3bd52f5ee6babc389b279" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.829620 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cfbss"] Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.846406 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cfbss"] Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.854164 4956 scope.go:117] "RemoveContainer" containerID="bb22ea4a9bd8b10c4c6d957845a0e18695f16c823cf178aa359774bd72f0f1ff" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.890531 4956 scope.go:117] "RemoveContainer" containerID="2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca" Sep 30 06:41:52 crc kubenswrapper[4956]: E0930 06:41:52.891678 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca\": container with ID starting with 2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca not found: ID does not exist" containerID="2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 
06:41:52.891731 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca"} err="failed to get container status \"2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca\": rpc error: code = NotFound desc = could not find container \"2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca\": container with ID starting with 2808145e29a8d837e2604906ffd891569a2aeaba1ebccf9f1229c2758a398bca not found: ID does not exist" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.891763 4956 scope.go:117] "RemoveContainer" containerID="f1cf773127add31040981f7bbf4dd717c3d1d1eaa7e3bd52f5ee6babc389b279" Sep 30 06:41:52 crc kubenswrapper[4956]: E0930 06:41:52.892291 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1cf773127add31040981f7bbf4dd717c3d1d1eaa7e3bd52f5ee6babc389b279\": container with ID starting with f1cf773127add31040981f7bbf4dd717c3d1d1eaa7e3bd52f5ee6babc389b279 not found: ID does not exist" containerID="f1cf773127add31040981f7bbf4dd717c3d1d1eaa7e3bd52f5ee6babc389b279" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.892322 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1cf773127add31040981f7bbf4dd717c3d1d1eaa7e3bd52f5ee6babc389b279"} err="failed to get container status \"f1cf773127add31040981f7bbf4dd717c3d1d1eaa7e3bd52f5ee6babc389b279\": rpc error: code = NotFound desc = could not find container \"f1cf773127add31040981f7bbf4dd717c3d1d1eaa7e3bd52f5ee6babc389b279\": container with ID starting with f1cf773127add31040981f7bbf4dd717c3d1d1eaa7e3bd52f5ee6babc389b279 not found: ID does not exist" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.892340 4956 scope.go:117] "RemoveContainer" containerID="bb22ea4a9bd8b10c4c6d957845a0e18695f16c823cf178aa359774bd72f0f1ff" Sep 30 06:41:52 crc 
kubenswrapper[4956]: E0930 06:41:52.892814 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb22ea4a9bd8b10c4c6d957845a0e18695f16c823cf178aa359774bd72f0f1ff\": container with ID starting with bb22ea4a9bd8b10c4c6d957845a0e18695f16c823cf178aa359774bd72f0f1ff not found: ID does not exist" containerID="bb22ea4a9bd8b10c4c6d957845a0e18695f16c823cf178aa359774bd72f0f1ff" Sep 30 06:41:52 crc kubenswrapper[4956]: I0930 06:41:52.892898 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb22ea4a9bd8b10c4c6d957845a0e18695f16c823cf178aa359774bd72f0f1ff"} err="failed to get container status \"bb22ea4a9bd8b10c4c6d957845a0e18695f16c823cf178aa359774bd72f0f1ff\": rpc error: code = NotFound desc = could not find container \"bb22ea4a9bd8b10c4c6d957845a0e18695f16c823cf178aa359774bd72f0f1ff\": container with ID starting with bb22ea4a9bd8b10c4c6d957845a0e18695f16c823cf178aa359774bd72f0f1ff not found: ID does not exist" Sep 30 06:41:54 crc kubenswrapper[4956]: I0930 06:41:54.365624 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a61640-95db-40a9-90e1-f0c6261e998d" path="/var/lib/kubelet/pods/a6a61640-95db-40a9-90e1-f0c6261e998d/volumes" Sep 30 06:42:03 crc kubenswrapper[4956]: I0930 06:42:03.342026 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:42:03 crc kubenswrapper[4956]: E0930 06:42:03.342818 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:42:17 crc 
kubenswrapper[4956]: I0930 06:42:17.343311 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:42:17 crc kubenswrapper[4956]: E0930 06:42:17.344102 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:42:29 crc kubenswrapper[4956]: I0930 06:42:29.341214 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:42:29 crc kubenswrapper[4956]: E0930 06:42:29.341929 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:42:43 crc kubenswrapper[4956]: I0930 06:42:43.341110 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:42:43 crc kubenswrapper[4956]: E0930 06:42:43.342320 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 
30 06:42:57 crc kubenswrapper[4956]: I0930 06:42:57.341734 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:42:57 crc kubenswrapper[4956]: E0930 06:42:57.343422 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:43:12 crc kubenswrapper[4956]: I0930 06:43:12.343627 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:43:12 crc kubenswrapper[4956]: E0930 06:43:12.345081 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:43:26 crc kubenswrapper[4956]: I0930 06:43:26.342459 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:43:26 crc kubenswrapper[4956]: E0930 06:43:26.344081 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" 
podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:43:40 crc kubenswrapper[4956]: I0930 06:43:40.356850 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:43:40 crc kubenswrapper[4956]: E0930 06:43:40.357844 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:43:52 crc kubenswrapper[4956]: I0930 06:43:52.342921 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:43:52 crc kubenswrapper[4956]: E0930 06:43:52.359102 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:44:03 crc kubenswrapper[4956]: I0930 06:44:03.342548 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:44:03 crc kubenswrapper[4956]: E0930 06:44:03.343992 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:44:14 crc kubenswrapper[4956]: I0930 06:44:14.341456 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:44:14 crc kubenswrapper[4956]: E0930 06:44:14.342315 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:44:28 crc kubenswrapper[4956]: I0930 06:44:28.341645 4956 scope.go:117] "RemoveContainer" containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:44:28 crc kubenswrapper[4956]: I0930 06:44:28.850417 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"e4217fa75f3f22deeaf4dead5680fd7c085512753ddfc7c30a460219d8671021"} Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.199705 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r"] Sep 30 06:45:00 crc kubenswrapper[4956]: E0930 06:45:00.209674 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a61640-95db-40a9-90e1-f0c6261e998d" containerName="extract-utilities" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.209719 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a61640-95db-40a9-90e1-f0c6261e998d" containerName="extract-utilities" Sep 30 06:45:00 crc kubenswrapper[4956]: E0930 06:45:00.209754 4956 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a6a61640-95db-40a9-90e1-f0c6261e998d" containerName="extract-content" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.209763 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a61640-95db-40a9-90e1-f0c6261e998d" containerName="extract-content" Sep 30 06:45:00 crc kubenswrapper[4956]: E0930 06:45:00.209787 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a61640-95db-40a9-90e1-f0c6261e998d" containerName="registry-server" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.209795 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a61640-95db-40a9-90e1-f0c6261e998d" containerName="registry-server" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.210061 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6a61640-95db-40a9-90e1-f0c6261e998d" containerName="registry-server" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.211066 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.216836 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.221546 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.228077 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r"] Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.308255 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/032342af-9d02-4813-8925-a6e97a0f69d4-config-volume\") pod \"collect-profiles-29320245-4vm6r\" (UID: \"032342af-9d02-4813-8925-a6e97a0f69d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.308936 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/032342af-9d02-4813-8925-a6e97a0f69d4-secret-volume\") pod \"collect-profiles-29320245-4vm6r\" (UID: \"032342af-9d02-4813-8925-a6e97a0f69d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.309848 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz7v6\" (UniqueName: \"kubernetes.io/projected/032342af-9d02-4813-8925-a6e97a0f69d4-kube-api-access-gz7v6\") pod \"collect-profiles-29320245-4vm6r\" (UID: \"032342af-9d02-4813-8925-a6e97a0f69d4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.412328 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/032342af-9d02-4813-8925-a6e97a0f69d4-secret-volume\") pod \"collect-profiles-29320245-4vm6r\" (UID: \"032342af-9d02-4813-8925-a6e97a0f69d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.412406 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz7v6\" (UniqueName: \"kubernetes.io/projected/032342af-9d02-4813-8925-a6e97a0f69d4-kube-api-access-gz7v6\") pod \"collect-profiles-29320245-4vm6r\" (UID: \"032342af-9d02-4813-8925-a6e97a0f69d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.412474 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/032342af-9d02-4813-8925-a6e97a0f69d4-config-volume\") pod \"collect-profiles-29320245-4vm6r\" (UID: \"032342af-9d02-4813-8925-a6e97a0f69d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.413642 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/032342af-9d02-4813-8925-a6e97a0f69d4-config-volume\") pod \"collect-profiles-29320245-4vm6r\" (UID: \"032342af-9d02-4813-8925-a6e97a0f69d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.423815 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/032342af-9d02-4813-8925-a6e97a0f69d4-secret-volume\") pod \"collect-profiles-29320245-4vm6r\" (UID: \"032342af-9d02-4813-8925-a6e97a0f69d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.440344 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz7v6\" (UniqueName: \"kubernetes.io/projected/032342af-9d02-4813-8925-a6e97a0f69d4-kube-api-access-gz7v6\") pod \"collect-profiles-29320245-4vm6r\" (UID: \"032342af-9d02-4813-8925-a6e97a0f69d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:00 crc kubenswrapper[4956]: I0930 06:45:00.563106 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:01 crc kubenswrapper[4956]: I0930 06:45:01.056171 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r"] Sep 30 06:45:01 crc kubenswrapper[4956]: W0930 06:45:01.071772 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod032342af_9d02_4813_8925_a6e97a0f69d4.slice/crio-d14637cc92ab7e4171060cd19c215db5de63265067fd11ac233e2236ce903281 WatchSource:0}: Error finding container d14637cc92ab7e4171060cd19c215db5de63265067fd11ac233e2236ce903281: Status 404 returned error can't find the container with id d14637cc92ab7e4171060cd19c215db5de63265067fd11ac233e2236ce903281 Sep 30 06:45:01 crc kubenswrapper[4956]: I0930 06:45:01.332716 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" event={"ID":"032342af-9d02-4813-8925-a6e97a0f69d4","Type":"ContainerStarted","Data":"6949407cbe47a208dbc4c320741fd0ba3afc99f5c779e593f7acdbd2b09a082a"} Sep 30 06:45:01 crc 
kubenswrapper[4956]: I0930 06:45:01.333257 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" event={"ID":"032342af-9d02-4813-8925-a6e97a0f69d4","Type":"ContainerStarted","Data":"d14637cc92ab7e4171060cd19c215db5de63265067fd11ac233e2236ce903281"} Sep 30 06:45:01 crc kubenswrapper[4956]: I0930 06:45:01.358268 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" podStartSLOduration=1.358238126 podStartE2EDuration="1.358238126s" podCreationTimestamp="2025-09-30 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:45:01.349947086 +0000 UTC m=+4571.677067621" watchObservedRunningTime="2025-09-30 06:45:01.358238126 +0000 UTC m=+4571.685358661" Sep 30 06:45:02 crc kubenswrapper[4956]: I0930 06:45:02.347394 4956 generic.go:334] "Generic (PLEG): container finished" podID="032342af-9d02-4813-8925-a6e97a0f69d4" containerID="6949407cbe47a208dbc4c320741fd0ba3afc99f5c779e593f7acdbd2b09a082a" exitCode=0 Sep 30 06:45:02 crc kubenswrapper[4956]: I0930 06:45:02.382463 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" event={"ID":"032342af-9d02-4813-8925-a6e97a0f69d4","Type":"ContainerDied","Data":"6949407cbe47a208dbc4c320741fd0ba3afc99f5c779e593f7acdbd2b09a082a"} Sep 30 06:45:03 crc kubenswrapper[4956]: I0930 06:45:03.865443 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:03 crc kubenswrapper[4956]: I0930 06:45:03.896721 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz7v6\" (UniqueName: \"kubernetes.io/projected/032342af-9d02-4813-8925-a6e97a0f69d4-kube-api-access-gz7v6\") pod \"032342af-9d02-4813-8925-a6e97a0f69d4\" (UID: \"032342af-9d02-4813-8925-a6e97a0f69d4\") " Sep 30 06:45:03 crc kubenswrapper[4956]: I0930 06:45:03.897311 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/032342af-9d02-4813-8925-a6e97a0f69d4-config-volume\") pod \"032342af-9d02-4813-8925-a6e97a0f69d4\" (UID: \"032342af-9d02-4813-8925-a6e97a0f69d4\") " Sep 30 06:45:03 crc kubenswrapper[4956]: I0930 06:45:03.897392 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/032342af-9d02-4813-8925-a6e97a0f69d4-secret-volume\") pod \"032342af-9d02-4813-8925-a6e97a0f69d4\" (UID: \"032342af-9d02-4813-8925-a6e97a0f69d4\") " Sep 30 06:45:03 crc kubenswrapper[4956]: I0930 06:45:03.898286 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/032342af-9d02-4813-8925-a6e97a0f69d4-config-volume" (OuterVolumeSpecName: "config-volume") pod "032342af-9d02-4813-8925-a6e97a0f69d4" (UID: "032342af-9d02-4813-8925-a6e97a0f69d4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:45:03 crc kubenswrapper[4956]: I0930 06:45:03.924727 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032342af-9d02-4813-8925-a6e97a0f69d4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "032342af-9d02-4813-8925-a6e97a0f69d4" (UID: "032342af-9d02-4813-8925-a6e97a0f69d4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:45:03 crc kubenswrapper[4956]: I0930 06:45:03.925092 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/032342af-9d02-4813-8925-a6e97a0f69d4-kube-api-access-gz7v6" (OuterVolumeSpecName: "kube-api-access-gz7v6") pod "032342af-9d02-4813-8925-a6e97a0f69d4" (UID: "032342af-9d02-4813-8925-a6e97a0f69d4"). InnerVolumeSpecName "kube-api-access-gz7v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:45:04 crc kubenswrapper[4956]: I0930 06:45:04.000053 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/032342af-9d02-4813-8925-a6e97a0f69d4-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:45:04 crc kubenswrapper[4956]: I0930 06:45:04.000105 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/032342af-9d02-4813-8925-a6e97a0f69d4-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:45:04 crc kubenswrapper[4956]: I0930 06:45:04.000148 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz7v6\" (UniqueName: \"kubernetes.io/projected/032342af-9d02-4813-8925-a6e97a0f69d4-kube-api-access-gz7v6\") on node \"crc\" DevicePath \"\"" Sep 30 06:45:04 crc kubenswrapper[4956]: I0930 06:45:04.400681 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" event={"ID":"032342af-9d02-4813-8925-a6e97a0f69d4","Type":"ContainerDied","Data":"d14637cc92ab7e4171060cd19c215db5de63265067fd11ac233e2236ce903281"} Sep 30 06:45:04 crc kubenswrapper[4956]: I0930 06:45:04.400770 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d14637cc92ab7e4171060cd19c215db5de63265067fd11ac233e2236ce903281" Sep 30 06:45:04 crc kubenswrapper[4956]: I0930 06:45:04.400888 4956 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-4vm6r" Sep 30 06:45:04 crc kubenswrapper[4956]: I0930 06:45:04.459127 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8"] Sep 30 06:45:04 crc kubenswrapper[4956]: I0930 06:45:04.469328 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320200-rc9c8"] Sep 30 06:45:06 crc kubenswrapper[4956]: I0930 06:45:06.355807 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719ab0a4-b97f-412a-8b05-52ae64f6d995" path="/var/lib/kubelet/pods/719ab0a4-b97f-412a-8b05-52ae64f6d995/volumes" Sep 30 06:45:15 crc kubenswrapper[4956]: I0930 06:45:15.397188 4956 scope.go:117] "RemoveContainer" containerID="a947d57ffef82fb533888f469c314643f902dcaf76d7a1676b840d17cbab3197" Sep 30 06:45:56 crc kubenswrapper[4956]: E0930 06:45:56.179223 4956 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.82:36340->38.102.83.82:43469: write tcp 38.102.83.82:36340->38.102.83.82:43469: write: broken pipe Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.585292 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4mt8z"] Sep 30 06:46:28 crc kubenswrapper[4956]: E0930 06:46:28.586890 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032342af-9d02-4813-8925-a6e97a0f69d4" containerName="collect-profiles" Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.586914 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="032342af-9d02-4813-8925-a6e97a0f69d4" containerName="collect-profiles" Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.587357 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="032342af-9d02-4813-8925-a6e97a0f69d4" containerName="collect-profiles" Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 
06:46:28.590284 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.604835 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mt8z"] Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.677005 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-utilities\") pod \"redhat-marketplace-4mt8z\" (UID: \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\") " pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.677278 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82sm4\" (UniqueName: \"kubernetes.io/projected/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-kube-api-access-82sm4\") pod \"redhat-marketplace-4mt8z\" (UID: \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\") " pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.677332 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-catalog-content\") pod \"redhat-marketplace-4mt8z\" (UID: \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\") " pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.779536 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82sm4\" (UniqueName: \"kubernetes.io/projected/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-kube-api-access-82sm4\") pod \"redhat-marketplace-4mt8z\" (UID: \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\") " pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:28 crc 
kubenswrapper[4956]: I0930 06:46:28.779629 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-catalog-content\") pod \"redhat-marketplace-4mt8z\" (UID: \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\") " pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.779697 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-utilities\") pod \"redhat-marketplace-4mt8z\" (UID: \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\") " pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.780136 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-catalog-content\") pod \"redhat-marketplace-4mt8z\" (UID: \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\") " pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.780154 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-utilities\") pod \"redhat-marketplace-4mt8z\" (UID: \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\") " pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.801081 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82sm4\" (UniqueName: \"kubernetes.io/projected/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-kube-api-access-82sm4\") pod \"redhat-marketplace-4mt8z\" (UID: \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\") " pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:28 crc kubenswrapper[4956]: I0930 06:46:28.935968 4956 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:29 crc kubenswrapper[4956]: I0930 06:46:29.435994 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mt8z"] Sep 30 06:46:29 crc kubenswrapper[4956]: I0930 06:46:29.478943 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mt8z" event={"ID":"73488d70-8ddf-4cf6-8fbf-74c0e45e3165","Type":"ContainerStarted","Data":"5a719e177f97c7063729c58ae9fdefccaf3a61d8d74464bd435325ea1f88f253"} Sep 30 06:46:30 crc kubenswrapper[4956]: I0930 06:46:30.497571 4956 generic.go:334] "Generic (PLEG): container finished" podID="73488d70-8ddf-4cf6-8fbf-74c0e45e3165" containerID="47d72e79a27bac13ce7c82575448400be2d0a2f355d8acf043844e7c7895620b" exitCode=0 Sep 30 06:46:30 crc kubenswrapper[4956]: I0930 06:46:30.497654 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mt8z" event={"ID":"73488d70-8ddf-4cf6-8fbf-74c0e45e3165","Type":"ContainerDied","Data":"47d72e79a27bac13ce7c82575448400be2d0a2f355d8acf043844e7c7895620b"} Sep 30 06:46:30 crc kubenswrapper[4956]: I0930 06:46:30.507053 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:46:32 crc kubenswrapper[4956]: I0930 06:46:32.532190 4956 generic.go:334] "Generic (PLEG): container finished" podID="73488d70-8ddf-4cf6-8fbf-74c0e45e3165" containerID="a3fee0e39b451e3bdf66feec270db9260eec5359d614cfe27b98a2f3c843447f" exitCode=0 Sep 30 06:46:32 crc kubenswrapper[4956]: I0930 06:46:32.532427 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mt8z" event={"ID":"73488d70-8ddf-4cf6-8fbf-74c0e45e3165","Type":"ContainerDied","Data":"a3fee0e39b451e3bdf66feec270db9260eec5359d614cfe27b98a2f3c843447f"} Sep 30 06:46:33 crc kubenswrapper[4956]: I0930 06:46:33.566063 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mt8z" event={"ID":"73488d70-8ddf-4cf6-8fbf-74c0e45e3165","Type":"ContainerStarted","Data":"a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf"} Sep 30 06:46:33 crc kubenswrapper[4956]: I0930 06:46:33.597832 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4mt8z" podStartSLOduration=3.175240644 podStartE2EDuration="5.59781428s" podCreationTimestamp="2025-09-30 06:46:28 +0000 UTC" firstStartedPulling="2025-09-30 06:46:30.506324582 +0000 UTC m=+4660.833445137" lastFinishedPulling="2025-09-30 06:46:32.928898238 +0000 UTC m=+4663.256018773" observedRunningTime="2025-09-30 06:46:33.597720237 +0000 UTC m=+4663.924840792" watchObservedRunningTime="2025-09-30 06:46:33.59781428 +0000 UTC m=+4663.924934815" Sep 30 06:46:38 crc kubenswrapper[4956]: I0930 06:46:38.936623 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:38 crc kubenswrapper[4956]: I0930 06:46:38.939088 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:39 crc kubenswrapper[4956]: I0930 06:46:39.010996 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:40 crc kubenswrapper[4956]: I0930 06:46:40.436821 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:40 crc kubenswrapper[4956]: I0930 06:46:40.513513 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mt8z"] Sep 30 06:46:41 crc kubenswrapper[4956]: I0930 06:46:41.671840 4956 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-4mt8z" podUID="73488d70-8ddf-4cf6-8fbf-74c0e45e3165" containerName="registry-server" containerID="cri-o://a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf" gracePeriod=2 Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.255149 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.400195 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-utilities\") pod \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\" (UID: \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\") " Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.400568 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82sm4\" (UniqueName: \"kubernetes.io/projected/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-kube-api-access-82sm4\") pod \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\" (UID: \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\") " Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.400768 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-catalog-content\") pod \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\" (UID: \"73488d70-8ddf-4cf6-8fbf-74c0e45e3165\") " Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.401060 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-utilities" (OuterVolumeSpecName: "utilities") pod "73488d70-8ddf-4cf6-8fbf-74c0e45e3165" (UID: "73488d70-8ddf-4cf6-8fbf-74c0e45e3165"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.401431 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.410286 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-kube-api-access-82sm4" (OuterVolumeSpecName: "kube-api-access-82sm4") pod "73488d70-8ddf-4cf6-8fbf-74c0e45e3165" (UID: "73488d70-8ddf-4cf6-8fbf-74c0e45e3165"). InnerVolumeSpecName "kube-api-access-82sm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.429228 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73488d70-8ddf-4cf6-8fbf-74c0e45e3165" (UID: "73488d70-8ddf-4cf6-8fbf-74c0e45e3165"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.503534 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82sm4\" (UniqueName: \"kubernetes.io/projected/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-kube-api-access-82sm4\") on node \"crc\" DevicePath \"\"" Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.503604 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73488d70-8ddf-4cf6-8fbf-74c0e45e3165-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.688781 4956 generic.go:334] "Generic (PLEG): container finished" podID="73488d70-8ddf-4cf6-8fbf-74c0e45e3165" containerID="a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf" exitCode=0 Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.688868 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mt8z" event={"ID":"73488d70-8ddf-4cf6-8fbf-74c0e45e3165","Type":"ContainerDied","Data":"a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf"} Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.688953 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mt8z" Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.690721 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mt8z" event={"ID":"73488d70-8ddf-4cf6-8fbf-74c0e45e3165","Type":"ContainerDied","Data":"5a719e177f97c7063729c58ae9fdefccaf3a61d8d74464bd435325ea1f88f253"} Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.690751 4956 scope.go:117] "RemoveContainer" containerID="a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf" Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.742826 4956 scope.go:117] "RemoveContainer" containerID="a3fee0e39b451e3bdf66feec270db9260eec5359d614cfe27b98a2f3c843447f" Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.765135 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mt8z"] Sep 30 06:46:42 crc kubenswrapper[4956]: I0930 06:46:42.777170 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mt8z"] Sep 30 06:46:43 crc kubenswrapper[4956]: I0930 06:46:43.587722 4956 scope.go:117] "RemoveContainer" containerID="47d72e79a27bac13ce7c82575448400be2d0a2f355d8acf043844e7c7895620b" Sep 30 06:46:43 crc kubenswrapper[4956]: I0930 06:46:43.617224 4956 scope.go:117] "RemoveContainer" containerID="a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf" Sep 30 06:46:43 crc kubenswrapper[4956]: E0930 06:46:43.617650 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf\": container with ID starting with a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf not found: ID does not exist" containerID="a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf" Sep 30 06:46:43 crc kubenswrapper[4956]: I0930 06:46:43.617697 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf"} err="failed to get container status \"a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf\": rpc error: code = NotFound desc = could not find container \"a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf\": container with ID starting with a58595b32638db236a5bed9718ff94a319b4d72a1f2c52aba27eaeae92f54cdf not found: ID does not exist" Sep 30 06:46:43 crc kubenswrapper[4956]: I0930 06:46:43.617726 4956 scope.go:117] "RemoveContainer" containerID="a3fee0e39b451e3bdf66feec270db9260eec5359d614cfe27b98a2f3c843447f" Sep 30 06:46:43 crc kubenswrapper[4956]: E0930 06:46:43.618462 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3fee0e39b451e3bdf66feec270db9260eec5359d614cfe27b98a2f3c843447f\": container with ID starting with a3fee0e39b451e3bdf66feec270db9260eec5359d614cfe27b98a2f3c843447f not found: ID does not exist" containerID="a3fee0e39b451e3bdf66feec270db9260eec5359d614cfe27b98a2f3c843447f" Sep 30 06:46:43 crc kubenswrapper[4956]: I0930 06:46:43.618487 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3fee0e39b451e3bdf66feec270db9260eec5359d614cfe27b98a2f3c843447f"} err="failed to get container status \"a3fee0e39b451e3bdf66feec270db9260eec5359d614cfe27b98a2f3c843447f\": rpc error: code = NotFound desc = could not find container \"a3fee0e39b451e3bdf66feec270db9260eec5359d614cfe27b98a2f3c843447f\": container with ID starting with a3fee0e39b451e3bdf66feec270db9260eec5359d614cfe27b98a2f3c843447f not found: ID does not exist" Sep 30 06:46:43 crc kubenswrapper[4956]: I0930 06:46:43.618501 4956 scope.go:117] "RemoveContainer" containerID="47d72e79a27bac13ce7c82575448400be2d0a2f355d8acf043844e7c7895620b" Sep 30 06:46:43 crc kubenswrapper[4956]: E0930 
06:46:43.619188 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47d72e79a27bac13ce7c82575448400be2d0a2f355d8acf043844e7c7895620b\": container with ID starting with 47d72e79a27bac13ce7c82575448400be2d0a2f355d8acf043844e7c7895620b not found: ID does not exist" containerID="47d72e79a27bac13ce7c82575448400be2d0a2f355d8acf043844e7c7895620b" Sep 30 06:46:43 crc kubenswrapper[4956]: I0930 06:46:43.619275 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d72e79a27bac13ce7c82575448400be2d0a2f355d8acf043844e7c7895620b"} err="failed to get container status \"47d72e79a27bac13ce7c82575448400be2d0a2f355d8acf043844e7c7895620b\": rpc error: code = NotFound desc = could not find container \"47d72e79a27bac13ce7c82575448400be2d0a2f355d8acf043844e7c7895620b\": container with ID starting with 47d72e79a27bac13ce7c82575448400be2d0a2f355d8acf043844e7c7895620b not found: ID does not exist" Sep 30 06:46:44 crc kubenswrapper[4956]: I0930 06:46:44.353705 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73488d70-8ddf-4cf6-8fbf-74c0e45e3165" path="/var/lib/kubelet/pods/73488d70-8ddf-4cf6-8fbf-74c0e45e3165/volumes" Sep 30 06:46:48 crc kubenswrapper[4956]: I0930 06:46:48.073642 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:46:48 crc kubenswrapper[4956]: I0930 06:46:48.074820 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Sep 30 06:47:18 crc kubenswrapper[4956]: I0930 06:47:18.073886 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:47:18 crc kubenswrapper[4956]: I0930 06:47:18.075226 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:47:48 crc kubenswrapper[4956]: I0930 06:47:48.073875 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:47:48 crc kubenswrapper[4956]: I0930 06:47:48.074908 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:47:48 crc kubenswrapper[4956]: I0930 06:47:48.074999 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 06:47:48 crc kubenswrapper[4956]: I0930 06:47:48.076453 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e4217fa75f3f22deeaf4dead5680fd7c085512753ddfc7c30a460219d8671021"} 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:47:48 crc kubenswrapper[4956]: I0930 06:47:48.076554 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://e4217fa75f3f22deeaf4dead5680fd7c085512753ddfc7c30a460219d8671021" gracePeriod=600 Sep 30 06:47:48 crc kubenswrapper[4956]: E0930 06:47:48.288412 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ecd015b_e216_40d8_ae78_711b2a65c193.slice/crio-e4217fa75f3f22deeaf4dead5680fd7c085512753ddfc7c30a460219d8671021.scope\": RecentStats: unable to find data in memory cache]" Sep 30 06:47:48 crc kubenswrapper[4956]: I0930 06:47:48.577670 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="e4217fa75f3f22deeaf4dead5680fd7c085512753ddfc7c30a460219d8671021" exitCode=0 Sep 30 06:47:48 crc kubenswrapper[4956]: I0930 06:47:48.577749 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"e4217fa75f3f22deeaf4dead5680fd7c085512753ddfc7c30a460219d8671021"} Sep 30 06:47:48 crc kubenswrapper[4956]: I0930 06:47:48.577796 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925"} Sep 30 06:47:48 crc kubenswrapper[4956]: I0930 06:47:48.577823 4956 scope.go:117] "RemoveContainer" 
containerID="65d14debee979b59c90de8eb18a28a8c9a44f5b4a92f8d2548b8171a1f02e024" Sep 30 06:49:24 crc kubenswrapper[4956]: E0930 06:49:24.087991 4956 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.82:59166->38.102.83.82:43469: read tcp 38.102.83.82:59166->38.102.83.82:43469: read: connection reset by peer Sep 30 06:49:24 crc kubenswrapper[4956]: E0930 06:49:24.089031 4956 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.82:59166->38.102.83.82:43469: write tcp 38.102.83.82:59166->38.102.83.82:43469: write: broken pipe Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.288937 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6lkvm"] Sep 30 06:49:35 crc kubenswrapper[4956]: E0930 06:49:35.290748 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73488d70-8ddf-4cf6-8fbf-74c0e45e3165" containerName="extract-utilities" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.290766 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="73488d70-8ddf-4cf6-8fbf-74c0e45e3165" containerName="extract-utilities" Sep 30 06:49:35 crc kubenswrapper[4956]: E0930 06:49:35.290803 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73488d70-8ddf-4cf6-8fbf-74c0e45e3165" containerName="registry-server" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.290812 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="73488d70-8ddf-4cf6-8fbf-74c0e45e3165" containerName="registry-server" Sep 30 06:49:35 crc kubenswrapper[4956]: E0930 06:49:35.290832 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73488d70-8ddf-4cf6-8fbf-74c0e45e3165" containerName="extract-content" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.290838 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="73488d70-8ddf-4cf6-8fbf-74c0e45e3165" containerName="extract-content" Sep 30 06:49:35 crc 
kubenswrapper[4956]: I0930 06:49:35.291358 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="73488d70-8ddf-4cf6-8fbf-74c0e45e3165" containerName="registry-server" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.295248 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.315292 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6lkvm"] Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.398731 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7aa45a-7508-4940-a8f5-fa328fef90b8-utilities\") pod \"community-operators-6lkvm\" (UID: \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\") " pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.398841 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5p2t\" (UniqueName: \"kubernetes.io/projected/4b7aa45a-7508-4940-a8f5-fa328fef90b8-kube-api-access-n5p2t\") pod \"community-operators-6lkvm\" (UID: \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\") " pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.399178 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7aa45a-7508-4940-a8f5-fa328fef90b8-catalog-content\") pod \"community-operators-6lkvm\" (UID: \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\") " pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.500478 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4b7aa45a-7508-4940-a8f5-fa328fef90b8-catalog-content\") pod \"community-operators-6lkvm\" (UID: \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\") " pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.500584 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7aa45a-7508-4940-a8f5-fa328fef90b8-utilities\") pod \"community-operators-6lkvm\" (UID: \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\") " pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.500641 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5p2t\" (UniqueName: \"kubernetes.io/projected/4b7aa45a-7508-4940-a8f5-fa328fef90b8-kube-api-access-n5p2t\") pod \"community-operators-6lkvm\" (UID: \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\") " pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.501815 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7aa45a-7508-4940-a8f5-fa328fef90b8-catalog-content\") pod \"community-operators-6lkvm\" (UID: \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\") " pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.501870 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7aa45a-7508-4940-a8f5-fa328fef90b8-utilities\") pod \"community-operators-6lkvm\" (UID: \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\") " pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.525903 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5p2t\" (UniqueName: 
\"kubernetes.io/projected/4b7aa45a-7508-4940-a8f5-fa328fef90b8-kube-api-access-n5p2t\") pod \"community-operators-6lkvm\" (UID: \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\") " pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:35 crc kubenswrapper[4956]: I0930 06:49:35.636572 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:36 crc kubenswrapper[4956]: I0930 06:49:36.234150 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6lkvm"] Sep 30 06:49:36 crc kubenswrapper[4956]: I0930 06:49:36.975959 4956 generic.go:334] "Generic (PLEG): container finished" podID="4b7aa45a-7508-4940-a8f5-fa328fef90b8" containerID="2c142b91a50bda6c8dff01211f2f2e534888f031c2d8d5ae5adc81e8a8df97bb" exitCode=0 Sep 30 06:49:36 crc kubenswrapper[4956]: I0930 06:49:36.976070 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lkvm" event={"ID":"4b7aa45a-7508-4940-a8f5-fa328fef90b8","Type":"ContainerDied","Data":"2c142b91a50bda6c8dff01211f2f2e534888f031c2d8d5ae5adc81e8a8df97bb"} Sep 30 06:49:36 crc kubenswrapper[4956]: I0930 06:49:36.976400 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lkvm" event={"ID":"4b7aa45a-7508-4940-a8f5-fa328fef90b8","Type":"ContainerStarted","Data":"3d191419e93146ab2e3fae95a4e7b6de1aecfebe1756b702da2027c0d804218b"} Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.228286 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vwzbm"] Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.231167 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.255768 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwzbm"] Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.365619 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c37bdfbc-3134-48bd-be57-e98d3707d752-utilities\") pod \"redhat-operators-vwzbm\" (UID: \"c37bdfbc-3134-48bd-be57-e98d3707d752\") " pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.365819 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hgfg\" (UniqueName: \"kubernetes.io/projected/c37bdfbc-3134-48bd-be57-e98d3707d752-kube-api-access-6hgfg\") pod \"redhat-operators-vwzbm\" (UID: \"c37bdfbc-3134-48bd-be57-e98d3707d752\") " pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.365915 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c37bdfbc-3134-48bd-be57-e98d3707d752-catalog-content\") pod \"redhat-operators-vwzbm\" (UID: \"c37bdfbc-3134-48bd-be57-e98d3707d752\") " pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.468415 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c37bdfbc-3134-48bd-be57-e98d3707d752-catalog-content\") pod \"redhat-operators-vwzbm\" (UID: \"c37bdfbc-3134-48bd-be57-e98d3707d752\") " pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.468568 4956 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c37bdfbc-3134-48bd-be57-e98d3707d752-utilities\") pod \"redhat-operators-vwzbm\" (UID: \"c37bdfbc-3134-48bd-be57-e98d3707d752\") " pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.468698 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hgfg\" (UniqueName: \"kubernetes.io/projected/c37bdfbc-3134-48bd-be57-e98d3707d752-kube-api-access-6hgfg\") pod \"redhat-operators-vwzbm\" (UID: \"c37bdfbc-3134-48bd-be57-e98d3707d752\") " pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.470210 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c37bdfbc-3134-48bd-be57-e98d3707d752-utilities\") pod \"redhat-operators-vwzbm\" (UID: \"c37bdfbc-3134-48bd-be57-e98d3707d752\") " pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.470212 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c37bdfbc-3134-48bd-be57-e98d3707d752-catalog-content\") pod \"redhat-operators-vwzbm\" (UID: \"c37bdfbc-3134-48bd-be57-e98d3707d752\") " pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.492306 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hgfg\" (UniqueName: \"kubernetes.io/projected/c37bdfbc-3134-48bd-be57-e98d3707d752-kube-api-access-6hgfg\") pod \"redhat-operators-vwzbm\" (UID: \"c37bdfbc-3134-48bd-be57-e98d3707d752\") " pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:38 crc kubenswrapper[4956]: I0930 06:49:38.569363 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:39 crc kubenswrapper[4956]: I0930 06:49:39.039859 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lkvm" event={"ID":"4b7aa45a-7508-4940-a8f5-fa328fef90b8","Type":"ContainerStarted","Data":"3baeba990ff98a41919cb1745febfdcb913089c5937471235a9f619beb5f0109"} Sep 30 06:49:39 crc kubenswrapper[4956]: W0930 06:49:39.062676 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc37bdfbc_3134_48bd_be57_e98d3707d752.slice/crio-307b29cd7b107ea088729d80e356ad3eeee53275f01ee3da1ef63bb7a1664172 WatchSource:0}: Error finding container 307b29cd7b107ea088729d80e356ad3eeee53275f01ee3da1ef63bb7a1664172: Status 404 returned error can't find the container with id 307b29cd7b107ea088729d80e356ad3eeee53275f01ee3da1ef63bb7a1664172 Sep 30 06:49:39 crc kubenswrapper[4956]: I0930 06:49:39.064341 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwzbm"] Sep 30 06:49:40 crc kubenswrapper[4956]: I0930 06:49:40.059616 4956 generic.go:334] "Generic (PLEG): container finished" podID="c37bdfbc-3134-48bd-be57-e98d3707d752" containerID="ec5dafb267acb692c77a9faf136d9f88dcfd8189477b7edb94d4833a9d1bbeaa" exitCode=0 Sep 30 06:49:40 crc kubenswrapper[4956]: I0930 06:49:40.059713 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwzbm" event={"ID":"c37bdfbc-3134-48bd-be57-e98d3707d752","Type":"ContainerDied","Data":"ec5dafb267acb692c77a9faf136d9f88dcfd8189477b7edb94d4833a9d1bbeaa"} Sep 30 06:49:40 crc kubenswrapper[4956]: I0930 06:49:40.060911 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwzbm" 
event={"ID":"c37bdfbc-3134-48bd-be57-e98d3707d752","Type":"ContainerStarted","Data":"307b29cd7b107ea088729d80e356ad3eeee53275f01ee3da1ef63bb7a1664172"} Sep 30 06:49:40 crc kubenswrapper[4956]: I0930 06:49:40.066826 4956 generic.go:334] "Generic (PLEG): container finished" podID="4b7aa45a-7508-4940-a8f5-fa328fef90b8" containerID="3baeba990ff98a41919cb1745febfdcb913089c5937471235a9f619beb5f0109" exitCode=0 Sep 30 06:49:40 crc kubenswrapper[4956]: I0930 06:49:40.066987 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lkvm" event={"ID":"4b7aa45a-7508-4940-a8f5-fa328fef90b8","Type":"ContainerDied","Data":"3baeba990ff98a41919cb1745febfdcb913089c5937471235a9f619beb5f0109"} Sep 30 06:49:41 crc kubenswrapper[4956]: I0930 06:49:41.085311 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lkvm" event={"ID":"4b7aa45a-7508-4940-a8f5-fa328fef90b8","Type":"ContainerStarted","Data":"93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828"} Sep 30 06:49:41 crc kubenswrapper[4956]: I0930 06:49:41.116724 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6lkvm" podStartSLOduration=2.493679897 podStartE2EDuration="6.116694683s" podCreationTimestamp="2025-09-30 06:49:35 +0000 UTC" firstStartedPulling="2025-09-30 06:49:36.978189973 +0000 UTC m=+4847.305310528" lastFinishedPulling="2025-09-30 06:49:40.601204759 +0000 UTC m=+4850.928325314" observedRunningTime="2025-09-30 06:49:41.113609276 +0000 UTC m=+4851.440729811" watchObservedRunningTime="2025-09-30 06:49:41.116694683 +0000 UTC m=+4851.443815208" Sep 30 06:49:42 crc kubenswrapper[4956]: I0930 06:49:42.103597 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwzbm" 
event={"ID":"c37bdfbc-3134-48bd-be57-e98d3707d752","Type":"ContainerStarted","Data":"8a78c7f4386fb836bab4d95febb3b7c159f051e5abbd7696cf9fecc0619921f6"} Sep 30 06:49:43 crc kubenswrapper[4956]: I0930 06:49:43.118757 4956 generic.go:334] "Generic (PLEG): container finished" podID="c37bdfbc-3134-48bd-be57-e98d3707d752" containerID="8a78c7f4386fb836bab4d95febb3b7c159f051e5abbd7696cf9fecc0619921f6" exitCode=0 Sep 30 06:49:43 crc kubenswrapper[4956]: I0930 06:49:43.118835 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwzbm" event={"ID":"c37bdfbc-3134-48bd-be57-e98d3707d752","Type":"ContainerDied","Data":"8a78c7f4386fb836bab4d95febb3b7c159f051e5abbd7696cf9fecc0619921f6"} Sep 30 06:49:44 crc kubenswrapper[4956]: I0930 06:49:44.136058 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwzbm" event={"ID":"c37bdfbc-3134-48bd-be57-e98d3707d752","Type":"ContainerStarted","Data":"eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6"} Sep 30 06:49:44 crc kubenswrapper[4956]: I0930 06:49:44.177211 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vwzbm" podStartSLOduration=2.638203527 podStartE2EDuration="6.17719051s" podCreationTimestamp="2025-09-30 06:49:38 +0000 UTC" firstStartedPulling="2025-09-30 06:49:40.063915521 +0000 UTC m=+4850.391036046" lastFinishedPulling="2025-09-30 06:49:43.602902504 +0000 UTC m=+4853.930023029" observedRunningTime="2025-09-30 06:49:44.169384956 +0000 UTC m=+4854.496505521" watchObservedRunningTime="2025-09-30 06:49:44.17719051 +0000 UTC m=+4854.504311025" Sep 30 06:49:45 crc kubenswrapper[4956]: I0930 06:49:45.637517 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:45 crc kubenswrapper[4956]: I0930 06:49:45.638608 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:45 crc kubenswrapper[4956]: I0930 06:49:45.685876 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:46 crc kubenswrapper[4956]: I0930 06:49:46.221958 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:48 crc kubenswrapper[4956]: I0930 06:49:48.074331 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:49:48 crc kubenswrapper[4956]: I0930 06:49:48.074860 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:49:48 crc kubenswrapper[4956]: I0930 06:49:48.212726 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6lkvm"] Sep 30 06:49:48 crc kubenswrapper[4956]: I0930 06:49:48.570932 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:48 crc kubenswrapper[4956]: I0930 06:49:48.571014 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:49 crc kubenswrapper[4956]: I0930 06:49:49.182447 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6lkvm" podUID="4b7aa45a-7508-4940-a8f5-fa328fef90b8" 
containerName="registry-server" containerID="cri-o://93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828" gracePeriod=2 Sep 30 06:49:49 crc kubenswrapper[4956]: I0930 06:49:49.645640 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vwzbm" podUID="c37bdfbc-3134-48bd-be57-e98d3707d752" containerName="registry-server" probeResult="failure" output=< Sep 30 06:49:49 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Sep 30 06:49:49 crc kubenswrapper[4956]: > Sep 30 06:49:49 crc kubenswrapper[4956]: I0930 06:49:49.681892 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:49 crc kubenswrapper[4956]: I0930 06:49:49.774883 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7aa45a-7508-4940-a8f5-fa328fef90b8-catalog-content\") pod \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\" (UID: \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\") " Sep 30 06:49:49 crc kubenswrapper[4956]: I0930 06:49:49.775041 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5p2t\" (UniqueName: \"kubernetes.io/projected/4b7aa45a-7508-4940-a8f5-fa328fef90b8-kube-api-access-n5p2t\") pod \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\" (UID: \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\") " Sep 30 06:49:49 crc kubenswrapper[4956]: I0930 06:49:49.775091 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7aa45a-7508-4940-a8f5-fa328fef90b8-utilities\") pod \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\" (UID: \"4b7aa45a-7508-4940-a8f5-fa328fef90b8\") " Sep 30 06:49:49 crc kubenswrapper[4956]: I0930 06:49:49.775934 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4b7aa45a-7508-4940-a8f5-fa328fef90b8-utilities" (OuterVolumeSpecName: "utilities") pod "4b7aa45a-7508-4940-a8f5-fa328fef90b8" (UID: "4b7aa45a-7508-4940-a8f5-fa328fef90b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:49:49 crc kubenswrapper[4956]: I0930 06:49:49.824375 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b7aa45a-7508-4940-a8f5-fa328fef90b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b7aa45a-7508-4940-a8f5-fa328fef90b8" (UID: "4b7aa45a-7508-4940-a8f5-fa328fef90b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:49:49 crc kubenswrapper[4956]: I0930 06:49:49.877465 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7aa45a-7508-4940-a8f5-fa328fef90b8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:49:49 crc kubenswrapper[4956]: I0930 06:49:49.877499 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7aa45a-7508-4940-a8f5-fa328fef90b8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.198181 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6lkvm" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.198096 4956 generic.go:334] "Generic (PLEG): container finished" podID="4b7aa45a-7508-4940-a8f5-fa328fef90b8" containerID="93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828" exitCode=0 Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.198179 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lkvm" event={"ID":"4b7aa45a-7508-4940-a8f5-fa328fef90b8","Type":"ContainerDied","Data":"93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828"} Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.198340 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lkvm" event={"ID":"4b7aa45a-7508-4940-a8f5-fa328fef90b8","Type":"ContainerDied","Data":"3d191419e93146ab2e3fae95a4e7b6de1aecfebe1756b702da2027c0d804218b"} Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.198372 4956 scope.go:117] "RemoveContainer" containerID="93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.223319 4956 scope.go:117] "RemoveContainer" containerID="3baeba990ff98a41919cb1745febfdcb913089c5937471235a9f619beb5f0109" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.374072 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b7aa45a-7508-4940-a8f5-fa328fef90b8-kube-api-access-n5p2t" (OuterVolumeSpecName: "kube-api-access-n5p2t") pod "4b7aa45a-7508-4940-a8f5-fa328fef90b8" (UID: "4b7aa45a-7508-4940-a8f5-fa328fef90b8"). InnerVolumeSpecName "kube-api-access-n5p2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.396597 4956 scope.go:117] "RemoveContainer" containerID="2c142b91a50bda6c8dff01211f2f2e534888f031c2d8d5ae5adc81e8a8df97bb" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.399491 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5p2t\" (UniqueName: \"kubernetes.io/projected/4b7aa45a-7508-4940-a8f5-fa328fef90b8-kube-api-access-n5p2t\") on node \"crc\" DevicePath \"\"" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.491733 4956 scope.go:117] "RemoveContainer" containerID="93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828" Sep 30 06:49:50 crc kubenswrapper[4956]: E0930 06:49:50.492308 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828\": container with ID starting with 93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828 not found: ID does not exist" containerID="93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.492351 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828"} err="failed to get container status \"93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828\": rpc error: code = NotFound desc = could not find container \"93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828\": container with ID starting with 93a3b3d18b8268ec1347e65609000ca04d3780c5011e3e76fdbbfe452bbd2828 not found: ID does not exist" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.492383 4956 scope.go:117] "RemoveContainer" containerID="3baeba990ff98a41919cb1745febfdcb913089c5937471235a9f619beb5f0109" Sep 30 06:49:50 crc kubenswrapper[4956]: E0930 06:49:50.492822 
4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3baeba990ff98a41919cb1745febfdcb913089c5937471235a9f619beb5f0109\": container with ID starting with 3baeba990ff98a41919cb1745febfdcb913089c5937471235a9f619beb5f0109 not found: ID does not exist" containerID="3baeba990ff98a41919cb1745febfdcb913089c5937471235a9f619beb5f0109" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.492864 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3baeba990ff98a41919cb1745febfdcb913089c5937471235a9f619beb5f0109"} err="failed to get container status \"3baeba990ff98a41919cb1745febfdcb913089c5937471235a9f619beb5f0109\": rpc error: code = NotFound desc = could not find container \"3baeba990ff98a41919cb1745febfdcb913089c5937471235a9f619beb5f0109\": container with ID starting with 3baeba990ff98a41919cb1745febfdcb913089c5937471235a9f619beb5f0109 not found: ID does not exist" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.492885 4956 scope.go:117] "RemoveContainer" containerID="2c142b91a50bda6c8dff01211f2f2e534888f031c2d8d5ae5adc81e8a8df97bb" Sep 30 06:49:50 crc kubenswrapper[4956]: E0930 06:49:50.493237 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c142b91a50bda6c8dff01211f2f2e534888f031c2d8d5ae5adc81e8a8df97bb\": container with ID starting with 2c142b91a50bda6c8dff01211f2f2e534888f031c2d8d5ae5adc81e8a8df97bb not found: ID does not exist" containerID="2c142b91a50bda6c8dff01211f2f2e534888f031c2d8d5ae5adc81e8a8df97bb" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.493270 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c142b91a50bda6c8dff01211f2f2e534888f031c2d8d5ae5adc81e8a8df97bb"} err="failed to get container status \"2c142b91a50bda6c8dff01211f2f2e534888f031c2d8d5ae5adc81e8a8df97bb\": rpc error: code = 
NotFound desc = could not find container \"2c142b91a50bda6c8dff01211f2f2e534888f031c2d8d5ae5adc81e8a8df97bb\": container with ID starting with 2c142b91a50bda6c8dff01211f2f2e534888f031c2d8d5ae5adc81e8a8df97bb not found: ID does not exist" Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.557444 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6lkvm"] Sep 30 06:49:50 crc kubenswrapper[4956]: I0930 06:49:50.569521 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6lkvm"] Sep 30 06:49:52 crc kubenswrapper[4956]: I0930 06:49:52.362775 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b7aa45a-7508-4940-a8f5-fa328fef90b8" path="/var/lib/kubelet/pods/4b7aa45a-7508-4940-a8f5-fa328fef90b8/volumes" Sep 30 06:49:58 crc kubenswrapper[4956]: I0930 06:49:58.661902 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:58 crc kubenswrapper[4956]: I0930 06:49:58.735167 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:49:58 crc kubenswrapper[4956]: I0930 06:49:58.907300 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vwzbm"] Sep 30 06:50:00 crc kubenswrapper[4956]: I0930 06:50:00.334099 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vwzbm" podUID="c37bdfbc-3134-48bd-be57-e98d3707d752" containerName="registry-server" containerID="cri-o://eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6" gracePeriod=2 Sep 30 06:50:00 crc kubenswrapper[4956]: I0930 06:50:00.871007 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.042770 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hgfg\" (UniqueName: \"kubernetes.io/projected/c37bdfbc-3134-48bd-be57-e98d3707d752-kube-api-access-6hgfg\") pod \"c37bdfbc-3134-48bd-be57-e98d3707d752\" (UID: \"c37bdfbc-3134-48bd-be57-e98d3707d752\") " Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.042934 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c37bdfbc-3134-48bd-be57-e98d3707d752-catalog-content\") pod \"c37bdfbc-3134-48bd-be57-e98d3707d752\" (UID: \"c37bdfbc-3134-48bd-be57-e98d3707d752\") " Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.043088 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c37bdfbc-3134-48bd-be57-e98d3707d752-utilities\") pod \"c37bdfbc-3134-48bd-be57-e98d3707d752\" (UID: \"c37bdfbc-3134-48bd-be57-e98d3707d752\") " Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.044027 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c37bdfbc-3134-48bd-be57-e98d3707d752-utilities" (OuterVolumeSpecName: "utilities") pod "c37bdfbc-3134-48bd-be57-e98d3707d752" (UID: "c37bdfbc-3134-48bd-be57-e98d3707d752"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.052367 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c37bdfbc-3134-48bd-be57-e98d3707d752-kube-api-access-6hgfg" (OuterVolumeSpecName: "kube-api-access-6hgfg") pod "c37bdfbc-3134-48bd-be57-e98d3707d752" (UID: "c37bdfbc-3134-48bd-be57-e98d3707d752"). InnerVolumeSpecName "kube-api-access-6hgfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.146091 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hgfg\" (UniqueName: \"kubernetes.io/projected/c37bdfbc-3134-48bd-be57-e98d3707d752-kube-api-access-6hgfg\") on node \"crc\" DevicePath \"\"" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.146172 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c37bdfbc-3134-48bd-be57-e98d3707d752-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.159829 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c37bdfbc-3134-48bd-be57-e98d3707d752-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c37bdfbc-3134-48bd-be57-e98d3707d752" (UID: "c37bdfbc-3134-48bd-be57-e98d3707d752"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.248834 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c37bdfbc-3134-48bd-be57-e98d3707d752-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.351154 4956 generic.go:334] "Generic (PLEG): container finished" podID="c37bdfbc-3134-48bd-be57-e98d3707d752" containerID="eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6" exitCode=0 Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.351220 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwzbm" event={"ID":"c37bdfbc-3134-48bd-be57-e98d3707d752","Type":"ContainerDied","Data":"eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6"} Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.351286 4956 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwzbm" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.351322 4956 scope.go:117] "RemoveContainer" containerID="eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.351299 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwzbm" event={"ID":"c37bdfbc-3134-48bd-be57-e98d3707d752","Type":"ContainerDied","Data":"307b29cd7b107ea088729d80e356ad3eeee53275f01ee3da1ef63bb7a1664172"} Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.383672 4956 scope.go:117] "RemoveContainer" containerID="8a78c7f4386fb836bab4d95febb3b7c159f051e5abbd7696cf9fecc0619921f6" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.406999 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vwzbm"] Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.423870 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vwzbm"] Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.432634 4956 scope.go:117] "RemoveContainer" containerID="ec5dafb267acb692c77a9faf136d9f88dcfd8189477b7edb94d4833a9d1bbeaa" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.462578 4956 scope.go:117] "RemoveContainer" containerID="eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6" Sep 30 06:50:01 crc kubenswrapper[4956]: E0930 06:50:01.463161 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6\": container with ID starting with eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6 not found: ID does not exist" containerID="eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.463219 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6"} err="failed to get container status \"eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6\": rpc error: code = NotFound desc = could not find container \"eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6\": container with ID starting with eec618528df363f816388187e54b45572fc27a425dcc46d018a1fd6de2c8dbd6 not found: ID does not exist" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.463254 4956 scope.go:117] "RemoveContainer" containerID="8a78c7f4386fb836bab4d95febb3b7c159f051e5abbd7696cf9fecc0619921f6" Sep 30 06:50:01 crc kubenswrapper[4956]: E0930 06:50:01.463899 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a78c7f4386fb836bab4d95febb3b7c159f051e5abbd7696cf9fecc0619921f6\": container with ID starting with 8a78c7f4386fb836bab4d95febb3b7c159f051e5abbd7696cf9fecc0619921f6 not found: ID does not exist" containerID="8a78c7f4386fb836bab4d95febb3b7c159f051e5abbd7696cf9fecc0619921f6" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.463963 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a78c7f4386fb836bab4d95febb3b7c159f051e5abbd7696cf9fecc0619921f6"} err="failed to get container status \"8a78c7f4386fb836bab4d95febb3b7c159f051e5abbd7696cf9fecc0619921f6\": rpc error: code = NotFound desc = could not find container \"8a78c7f4386fb836bab4d95febb3b7c159f051e5abbd7696cf9fecc0619921f6\": container with ID starting with 8a78c7f4386fb836bab4d95febb3b7c159f051e5abbd7696cf9fecc0619921f6 not found: ID does not exist" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.464008 4956 scope.go:117] "RemoveContainer" containerID="ec5dafb267acb692c77a9faf136d9f88dcfd8189477b7edb94d4833a9d1bbeaa" Sep 30 06:50:01 crc kubenswrapper[4956]: E0930 
06:50:01.464406 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec5dafb267acb692c77a9faf136d9f88dcfd8189477b7edb94d4833a9d1bbeaa\": container with ID starting with ec5dafb267acb692c77a9faf136d9f88dcfd8189477b7edb94d4833a9d1bbeaa not found: ID does not exist" containerID="ec5dafb267acb692c77a9faf136d9f88dcfd8189477b7edb94d4833a9d1bbeaa" Sep 30 06:50:01 crc kubenswrapper[4956]: I0930 06:50:01.464441 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec5dafb267acb692c77a9faf136d9f88dcfd8189477b7edb94d4833a9d1bbeaa"} err="failed to get container status \"ec5dafb267acb692c77a9faf136d9f88dcfd8189477b7edb94d4833a9d1bbeaa\": rpc error: code = NotFound desc = could not find container \"ec5dafb267acb692c77a9faf136d9f88dcfd8189477b7edb94d4833a9d1bbeaa\": container with ID starting with ec5dafb267acb692c77a9faf136d9f88dcfd8189477b7edb94d4833a9d1bbeaa not found: ID does not exist" Sep 30 06:50:02 crc kubenswrapper[4956]: I0930 06:50:02.363999 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c37bdfbc-3134-48bd-be57-e98d3707d752" path="/var/lib/kubelet/pods/c37bdfbc-3134-48bd-be57-e98d3707d752/volumes" Sep 30 06:50:18 crc kubenswrapper[4956]: I0930 06:50:18.073585 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:50:18 crc kubenswrapper[4956]: I0930 06:50:18.074510 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Sep 30 06:50:48 crc kubenswrapper[4956]: I0930 06:50:48.073935 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:50:48 crc kubenswrapper[4956]: I0930 06:50:48.077443 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:50:48 crc kubenswrapper[4956]: I0930 06:50:48.077743 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 06:50:48 crc kubenswrapper[4956]: I0930 06:50:48.083774 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:50:48 crc kubenswrapper[4956]: I0930 06:50:48.084094 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" gracePeriod=600 Sep 30 06:50:48 crc kubenswrapper[4956]: E0930 06:50:48.216089 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:50:49 crc kubenswrapper[4956]: I0930 06:50:49.023910 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" exitCode=0 Sep 30 06:50:49 crc kubenswrapper[4956]: I0930 06:50:49.023987 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925"} Sep 30 06:50:49 crc kubenswrapper[4956]: I0930 06:50:49.024455 4956 scope.go:117] "RemoveContainer" containerID="e4217fa75f3f22deeaf4dead5680fd7c085512753ddfc7c30a460219d8671021" Sep 30 06:50:49 crc kubenswrapper[4956]: I0930 06:50:49.026294 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:50:49 crc kubenswrapper[4956]: E0930 06:50:49.027339 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:51:02 crc kubenswrapper[4956]: I0930 06:51:02.341537 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:51:02 crc kubenswrapper[4956]: E0930 06:51:02.342964 4956 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:51:13 crc kubenswrapper[4956]: I0930 06:51:13.341526 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:51:13 crc kubenswrapper[4956]: E0930 06:51:13.343257 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:51:27 crc kubenswrapper[4956]: I0930 06:51:27.341973 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:51:27 crc kubenswrapper[4956]: E0930 06:51:27.342997 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:51:42 crc kubenswrapper[4956]: I0930 06:51:42.341809 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:51:42 crc kubenswrapper[4956]: E0930 06:51:42.343041 4956 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:51:55 crc kubenswrapper[4956]: I0930 06:51:55.342299 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:51:55 crc kubenswrapper[4956]: E0930 06:51:55.343688 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:52:06 crc kubenswrapper[4956]: I0930 06:52:06.341679 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:52:06 crc kubenswrapper[4956]: E0930 06:52:06.343993 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:52:19 crc kubenswrapper[4956]: I0930 06:52:19.341409 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:52:19 crc kubenswrapper[4956]: E0930 
06:52:19.342645 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:52:34 crc kubenswrapper[4956]: I0930 06:52:34.342204 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:52:34 crc kubenswrapper[4956]: E0930 06:52:34.343531 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:52:45 crc kubenswrapper[4956]: I0930 06:52:45.341678 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:52:45 crc kubenswrapper[4956]: E0930 06:52:45.342879 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:52:56 crc kubenswrapper[4956]: I0930 06:52:56.342352 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:52:56 crc 
kubenswrapper[4956]: E0930 06:52:56.343439 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.209614 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cpkfp"] Sep 30 06:53:01 crc kubenswrapper[4956]: E0930 06:53:01.211026 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c37bdfbc-3134-48bd-be57-e98d3707d752" containerName="extract-utilities" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.211044 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c37bdfbc-3134-48bd-be57-e98d3707d752" containerName="extract-utilities" Sep 30 06:53:01 crc kubenswrapper[4956]: E0930 06:53:01.211067 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7aa45a-7508-4940-a8f5-fa328fef90b8" containerName="extract-utilities" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.211076 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7aa45a-7508-4940-a8f5-fa328fef90b8" containerName="extract-utilities" Sep 30 06:53:01 crc kubenswrapper[4956]: E0930 06:53:01.211092 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c37bdfbc-3134-48bd-be57-e98d3707d752" containerName="registry-server" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.211100 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c37bdfbc-3134-48bd-be57-e98d3707d752" containerName="registry-server" Sep 30 06:53:01 crc kubenswrapper[4956]: E0930 06:53:01.211144 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b7aa45a-7508-4940-a8f5-fa328fef90b8" containerName="extract-content" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.211152 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7aa45a-7508-4940-a8f5-fa328fef90b8" containerName="extract-content" Sep 30 06:53:01 crc kubenswrapper[4956]: E0930 06:53:01.211174 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c37bdfbc-3134-48bd-be57-e98d3707d752" containerName="extract-content" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.211181 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c37bdfbc-3134-48bd-be57-e98d3707d752" containerName="extract-content" Sep 30 06:53:01 crc kubenswrapper[4956]: E0930 06:53:01.211194 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7aa45a-7508-4940-a8f5-fa328fef90b8" containerName="registry-server" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.211201 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7aa45a-7508-4940-a8f5-fa328fef90b8" containerName="registry-server" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.211460 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c37bdfbc-3134-48bd-be57-e98d3707d752" containerName="registry-server" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.211474 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b7aa45a-7508-4940-a8f5-fa328fef90b8" containerName="registry-server" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.213265 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.221242 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cpkfp"] Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.281513 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f05d38-8459-4c35-9868-322fb1427a9d-utilities\") pod \"certified-operators-cpkfp\" (UID: \"83f05d38-8459-4c35-9868-322fb1427a9d\") " pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.281587 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk8wl\" (UniqueName: \"kubernetes.io/projected/83f05d38-8459-4c35-9868-322fb1427a9d-kube-api-access-gk8wl\") pod \"certified-operators-cpkfp\" (UID: \"83f05d38-8459-4c35-9868-322fb1427a9d\") " pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.281636 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f05d38-8459-4c35-9868-322fb1427a9d-catalog-content\") pod \"certified-operators-cpkfp\" (UID: \"83f05d38-8459-4c35-9868-322fb1427a9d\") " pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.384211 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f05d38-8459-4c35-9868-322fb1427a9d-utilities\") pod \"certified-operators-cpkfp\" (UID: \"83f05d38-8459-4c35-9868-322fb1427a9d\") " pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.384301 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gk8wl\" (UniqueName: \"kubernetes.io/projected/83f05d38-8459-4c35-9868-322fb1427a9d-kube-api-access-gk8wl\") pod \"certified-operators-cpkfp\" (UID: \"83f05d38-8459-4c35-9868-322fb1427a9d\") " pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.384357 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f05d38-8459-4c35-9868-322fb1427a9d-catalog-content\") pod \"certified-operators-cpkfp\" (UID: \"83f05d38-8459-4c35-9868-322fb1427a9d\") " pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.384676 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f05d38-8459-4c35-9868-322fb1427a9d-utilities\") pod \"certified-operators-cpkfp\" (UID: \"83f05d38-8459-4c35-9868-322fb1427a9d\") " pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.384911 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f05d38-8459-4c35-9868-322fb1427a9d-catalog-content\") pod \"certified-operators-cpkfp\" (UID: \"83f05d38-8459-4c35-9868-322fb1427a9d\") " pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.412390 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk8wl\" (UniqueName: \"kubernetes.io/projected/83f05d38-8459-4c35-9868-322fb1427a9d-kube-api-access-gk8wl\") pod \"certified-operators-cpkfp\" (UID: \"83f05d38-8459-4c35-9868-322fb1427a9d\") " pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:01 crc kubenswrapper[4956]: I0930 06:53:01.544034 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:02 crc kubenswrapper[4956]: I0930 06:53:02.076301 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cpkfp"] Sep 30 06:53:02 crc kubenswrapper[4956]: I0930 06:53:02.761322 4956 generic.go:334] "Generic (PLEG): container finished" podID="83f05d38-8459-4c35-9868-322fb1427a9d" containerID="459afdb6fe893af7c807de10e3c5ef7fb99c89017cd59207ab818bf08c081134" exitCode=0 Sep 30 06:53:02 crc kubenswrapper[4956]: I0930 06:53:02.761409 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cpkfp" event={"ID":"83f05d38-8459-4c35-9868-322fb1427a9d","Type":"ContainerDied","Data":"459afdb6fe893af7c807de10e3c5ef7fb99c89017cd59207ab818bf08c081134"} Sep 30 06:53:02 crc kubenswrapper[4956]: I0930 06:53:02.761658 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cpkfp" event={"ID":"83f05d38-8459-4c35-9868-322fb1427a9d","Type":"ContainerStarted","Data":"d22cdd4a79a59acbf4f7433e76294579422777110a714ea957ddb9226b077e02"} Sep 30 06:53:02 crc kubenswrapper[4956]: I0930 06:53:02.764740 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:53:04 crc kubenswrapper[4956]: I0930 06:53:04.784064 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cpkfp" event={"ID":"83f05d38-8459-4c35-9868-322fb1427a9d","Type":"ContainerStarted","Data":"bebbc5957a7b16f2c9da905ec415910a2b4eb6a6f856bedb6d96c2b7ad79eb2d"} Sep 30 06:53:05 crc kubenswrapper[4956]: I0930 06:53:05.798851 4956 generic.go:334] "Generic (PLEG): container finished" podID="83f05d38-8459-4c35-9868-322fb1427a9d" containerID="bebbc5957a7b16f2c9da905ec415910a2b4eb6a6f856bedb6d96c2b7ad79eb2d" exitCode=0 Sep 30 06:53:05 crc kubenswrapper[4956]: I0930 06:53:05.798960 4956 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-cpkfp" event={"ID":"83f05d38-8459-4c35-9868-322fb1427a9d","Type":"ContainerDied","Data":"bebbc5957a7b16f2c9da905ec415910a2b4eb6a6f856bedb6d96c2b7ad79eb2d"} Sep 30 06:53:06 crc kubenswrapper[4956]: I0930 06:53:06.813544 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cpkfp" event={"ID":"83f05d38-8459-4c35-9868-322fb1427a9d","Type":"ContainerStarted","Data":"725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a"} Sep 30 06:53:06 crc kubenswrapper[4956]: I0930 06:53:06.836523 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cpkfp" podStartSLOduration=2.1776113280000002 podStartE2EDuration="5.836504797s" podCreationTimestamp="2025-09-30 06:53:01 +0000 UTC" firstStartedPulling="2025-09-30 06:53:02.764380688 +0000 UTC m=+5053.091501243" lastFinishedPulling="2025-09-30 06:53:06.423274187 +0000 UTC m=+5056.750394712" observedRunningTime="2025-09-30 06:53:06.833322517 +0000 UTC m=+5057.160443042" watchObservedRunningTime="2025-09-30 06:53:06.836504797 +0000 UTC m=+5057.163625322" Sep 30 06:53:11 crc kubenswrapper[4956]: I0930 06:53:11.341089 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:53:11 crc kubenswrapper[4956]: E0930 06:53:11.342075 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:53:11 crc kubenswrapper[4956]: I0930 06:53:11.544825 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:11 crc kubenswrapper[4956]: I0930 06:53:11.544903 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:11 crc kubenswrapper[4956]: I0930 06:53:11.613466 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:11 crc kubenswrapper[4956]: I0930 06:53:11.953958 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:12 crc kubenswrapper[4956]: I0930 06:53:12.025952 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cpkfp"] Sep 30 06:53:13 crc kubenswrapper[4956]: I0930 06:53:13.889335 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cpkfp" podUID="83f05d38-8459-4c35-9868-322fb1427a9d" containerName="registry-server" containerID="cri-o://725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a" gracePeriod=2 Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.456894 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.575695 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f05d38-8459-4c35-9868-322fb1427a9d-catalog-content\") pod \"83f05d38-8459-4c35-9868-322fb1427a9d\" (UID: \"83f05d38-8459-4c35-9868-322fb1427a9d\") " Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.575921 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk8wl\" (UniqueName: \"kubernetes.io/projected/83f05d38-8459-4c35-9868-322fb1427a9d-kube-api-access-gk8wl\") pod \"83f05d38-8459-4c35-9868-322fb1427a9d\" (UID: \"83f05d38-8459-4c35-9868-322fb1427a9d\") " Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.575963 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f05d38-8459-4c35-9868-322fb1427a9d-utilities\") pod \"83f05d38-8459-4c35-9868-322fb1427a9d\" (UID: \"83f05d38-8459-4c35-9868-322fb1427a9d\") " Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.577481 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f05d38-8459-4c35-9868-322fb1427a9d-utilities" (OuterVolumeSpecName: "utilities") pod "83f05d38-8459-4c35-9868-322fb1427a9d" (UID: "83f05d38-8459-4c35-9868-322fb1427a9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.600368 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f05d38-8459-4c35-9868-322fb1427a9d-kube-api-access-gk8wl" (OuterVolumeSpecName: "kube-api-access-gk8wl") pod "83f05d38-8459-4c35-9868-322fb1427a9d" (UID: "83f05d38-8459-4c35-9868-322fb1427a9d"). InnerVolumeSpecName "kube-api-access-gk8wl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.624080 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f05d38-8459-4c35-9868-322fb1427a9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83f05d38-8459-4c35-9868-322fb1427a9d" (UID: "83f05d38-8459-4c35-9868-322fb1427a9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.678780 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83f05d38-8459-4c35-9868-322fb1427a9d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.678841 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk8wl\" (UniqueName: \"kubernetes.io/projected/83f05d38-8459-4c35-9868-322fb1427a9d-kube-api-access-gk8wl\") on node \"crc\" DevicePath \"\"" Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.678857 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83f05d38-8459-4c35-9868-322fb1427a9d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.901798 4956 generic.go:334] "Generic (PLEG): container finished" podID="83f05d38-8459-4c35-9868-322fb1427a9d" containerID="725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a" exitCode=0 Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.901874 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cpkfp" Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.901897 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cpkfp" event={"ID":"83f05d38-8459-4c35-9868-322fb1427a9d","Type":"ContainerDied","Data":"725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a"} Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.902829 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cpkfp" event={"ID":"83f05d38-8459-4c35-9868-322fb1427a9d","Type":"ContainerDied","Data":"d22cdd4a79a59acbf4f7433e76294579422777110a714ea957ddb9226b077e02"} Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.902870 4956 scope.go:117] "RemoveContainer" containerID="725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a" Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.930922 4956 scope.go:117] "RemoveContainer" containerID="bebbc5957a7b16f2c9da905ec415910a2b4eb6a6f856bedb6d96c2b7ad79eb2d" Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.938348 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cpkfp"] Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.947999 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cpkfp"] Sep 30 06:53:14 crc kubenswrapper[4956]: I0930 06:53:14.973223 4956 scope.go:117] "RemoveContainer" containerID="459afdb6fe893af7c807de10e3c5ef7fb99c89017cd59207ab818bf08c081134" Sep 30 06:53:15 crc kubenswrapper[4956]: I0930 06:53:15.015004 4956 scope.go:117] "RemoveContainer" containerID="725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a" Sep 30 06:53:15 crc kubenswrapper[4956]: E0930 06:53:15.016712 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a\": container with ID starting with 725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a not found: ID does not exist" containerID="725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a" Sep 30 06:53:15 crc kubenswrapper[4956]: I0930 06:53:15.016786 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a"} err="failed to get container status \"725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a\": rpc error: code = NotFound desc = could not find container \"725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a\": container with ID starting with 725f2067a4a7256c418a6689f49868ed241abe01f0aa16f1f072af239768af6a not found: ID does not exist" Sep 30 06:53:15 crc kubenswrapper[4956]: I0930 06:53:15.016835 4956 scope.go:117] "RemoveContainer" containerID="bebbc5957a7b16f2c9da905ec415910a2b4eb6a6f856bedb6d96c2b7ad79eb2d" Sep 30 06:53:15 crc kubenswrapper[4956]: E0930 06:53:15.017375 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bebbc5957a7b16f2c9da905ec415910a2b4eb6a6f856bedb6d96c2b7ad79eb2d\": container with ID starting with bebbc5957a7b16f2c9da905ec415910a2b4eb6a6f856bedb6d96c2b7ad79eb2d not found: ID does not exist" containerID="bebbc5957a7b16f2c9da905ec415910a2b4eb6a6f856bedb6d96c2b7ad79eb2d" Sep 30 06:53:15 crc kubenswrapper[4956]: I0930 06:53:15.017426 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bebbc5957a7b16f2c9da905ec415910a2b4eb6a6f856bedb6d96c2b7ad79eb2d"} err="failed to get container status \"bebbc5957a7b16f2c9da905ec415910a2b4eb6a6f856bedb6d96c2b7ad79eb2d\": rpc error: code = NotFound desc = could not find container \"bebbc5957a7b16f2c9da905ec415910a2b4eb6a6f856bedb6d96c2b7ad79eb2d\": container with ID 
starting with bebbc5957a7b16f2c9da905ec415910a2b4eb6a6f856bedb6d96c2b7ad79eb2d not found: ID does not exist" Sep 30 06:53:15 crc kubenswrapper[4956]: I0930 06:53:15.017453 4956 scope.go:117] "RemoveContainer" containerID="459afdb6fe893af7c807de10e3c5ef7fb99c89017cd59207ab818bf08c081134" Sep 30 06:53:15 crc kubenswrapper[4956]: E0930 06:53:15.017855 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"459afdb6fe893af7c807de10e3c5ef7fb99c89017cd59207ab818bf08c081134\": container with ID starting with 459afdb6fe893af7c807de10e3c5ef7fb99c89017cd59207ab818bf08c081134 not found: ID does not exist" containerID="459afdb6fe893af7c807de10e3c5ef7fb99c89017cd59207ab818bf08c081134" Sep 30 06:53:15 crc kubenswrapper[4956]: I0930 06:53:15.017901 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"459afdb6fe893af7c807de10e3c5ef7fb99c89017cd59207ab818bf08c081134"} err="failed to get container status \"459afdb6fe893af7c807de10e3c5ef7fb99c89017cd59207ab818bf08c081134\": rpc error: code = NotFound desc = could not find container \"459afdb6fe893af7c807de10e3c5ef7fb99c89017cd59207ab818bf08c081134\": container with ID starting with 459afdb6fe893af7c807de10e3c5ef7fb99c89017cd59207ab818bf08c081134 not found: ID does not exist" Sep 30 06:53:16 crc kubenswrapper[4956]: I0930 06:53:16.351538 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f05d38-8459-4c35-9868-322fb1427a9d" path="/var/lib/kubelet/pods/83f05d38-8459-4c35-9868-322fb1427a9d/volumes" Sep 30 06:53:24 crc kubenswrapper[4956]: I0930 06:53:24.342030 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:53:24 crc kubenswrapper[4956]: E0930 06:53:24.343489 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:53:35 crc kubenswrapper[4956]: I0930 06:53:35.341630 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:53:35 crc kubenswrapper[4956]: E0930 06:53:35.342958 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:53:49 crc kubenswrapper[4956]: I0930 06:53:49.342047 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:53:49 crc kubenswrapper[4956]: E0930 06:53:49.343229 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:54:01 crc kubenswrapper[4956]: I0930 06:54:01.342026 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:54:01 crc kubenswrapper[4956]: E0930 06:54:01.342762 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:54:13 crc kubenswrapper[4956]: I0930 06:54:13.341943 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:54:13 crc kubenswrapper[4956]: E0930 06:54:13.342928 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:54:24 crc kubenswrapper[4956]: I0930 06:54:24.341998 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:54:24 crc kubenswrapper[4956]: E0930 06:54:24.342899 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:54:37 crc kubenswrapper[4956]: I0930 06:54:37.342670 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:54:37 crc kubenswrapper[4956]: E0930 06:54:37.344225 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:54:51 crc kubenswrapper[4956]: I0930 06:54:51.345148 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:54:51 crc kubenswrapper[4956]: E0930 06:54:51.346074 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:55:04 crc kubenswrapper[4956]: I0930 06:55:04.342173 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:55:04 crc kubenswrapper[4956]: E0930 06:55:04.342927 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:55:15 crc kubenswrapper[4956]: I0930 06:55:15.341753 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:55:15 crc kubenswrapper[4956]: E0930 06:55:15.342330 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:55:27 crc kubenswrapper[4956]: I0930 06:55:27.342370 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:55:27 crc kubenswrapper[4956]: E0930 06:55:27.343202 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:55:42 crc kubenswrapper[4956]: I0930 06:55:42.341142 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:55:42 crc kubenswrapper[4956]: E0930 06:55:42.341920 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 06:55:54 crc kubenswrapper[4956]: I0930 06:55:54.341586 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:55:54 crc kubenswrapper[4956]: I0930 06:55:54.755451 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"402d986c08b22b41e57c748a1f37841a5ee64c5f322e42940d7cbb921db697d7"} Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.028966 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z5sd5"] Sep 30 06:57:45 crc kubenswrapper[4956]: E0930 06:57:45.030418 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f05d38-8459-4c35-9868-322fb1427a9d" containerName="extract-content" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.030437 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f05d38-8459-4c35-9868-322fb1427a9d" containerName="extract-content" Sep 30 06:57:45 crc kubenswrapper[4956]: E0930 06:57:45.030472 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f05d38-8459-4c35-9868-322fb1427a9d" containerName="extract-utilities" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.030483 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f05d38-8459-4c35-9868-322fb1427a9d" containerName="extract-utilities" Sep 30 06:57:45 crc kubenswrapper[4956]: E0930 06:57:45.030502 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f05d38-8459-4c35-9868-322fb1427a9d" containerName="registry-server" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.030509 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f05d38-8459-4c35-9868-322fb1427a9d" containerName="registry-server" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.030797 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f05d38-8459-4c35-9868-322fb1427a9d" containerName="registry-server" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.032736 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.059021 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5sd5"] Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.112478 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-utilities\") pod \"redhat-marketplace-z5sd5\" (UID: \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\") " pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.112755 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-catalog-content\") pod \"redhat-marketplace-z5sd5\" (UID: \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\") " pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.112972 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfhwg\" (UniqueName: \"kubernetes.io/projected/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-kube-api-access-dfhwg\") pod \"redhat-marketplace-z5sd5\" (UID: \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\") " pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.214719 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-catalog-content\") pod \"redhat-marketplace-z5sd5\" (UID: \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\") " pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.214845 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dfhwg\" (UniqueName: \"kubernetes.io/projected/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-kube-api-access-dfhwg\") pod \"redhat-marketplace-z5sd5\" (UID: \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\") " pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.214890 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-utilities\") pod \"redhat-marketplace-z5sd5\" (UID: \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\") " pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.215515 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-utilities\") pod \"redhat-marketplace-z5sd5\" (UID: \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\") " pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.215807 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-catalog-content\") pod \"redhat-marketplace-z5sd5\" (UID: \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\") " pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.237898 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfhwg\" (UniqueName: \"kubernetes.io/projected/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-kube-api-access-dfhwg\") pod \"redhat-marketplace-z5sd5\" (UID: \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\") " pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.368616 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:45 crc kubenswrapper[4956]: I0930 06:57:45.874804 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5sd5"] Sep 30 06:57:46 crc kubenswrapper[4956]: I0930 06:57:46.054134 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5sd5" event={"ID":"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0","Type":"ContainerStarted","Data":"6120c73914153352afc2e5a89bfc52499a6d028701bd999f4bb4e3875bb26cea"} Sep 30 06:57:47 crc kubenswrapper[4956]: I0930 06:57:47.066650 4956 generic.go:334] "Generic (PLEG): container finished" podID="8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" containerID="ada355b1d30983f7b60f14e87a2bbc3475496ad2c4f25105330086e93cce94a6" exitCode=0 Sep 30 06:57:47 crc kubenswrapper[4956]: I0930 06:57:47.066746 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5sd5" event={"ID":"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0","Type":"ContainerDied","Data":"ada355b1d30983f7b60f14e87a2bbc3475496ad2c4f25105330086e93cce94a6"} Sep 30 06:57:49 crc kubenswrapper[4956]: I0930 06:57:49.086186 4956 generic.go:334] "Generic (PLEG): container finished" podID="8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" containerID="9d6b6d2ddce281963f75a6c96ebc5d83185cc99b3b5fc11880fdaf0df42eb9ee" exitCode=0 Sep 30 06:57:49 crc kubenswrapper[4956]: I0930 06:57:49.086302 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5sd5" event={"ID":"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0","Type":"ContainerDied","Data":"9d6b6d2ddce281963f75a6c96ebc5d83185cc99b3b5fc11880fdaf0df42eb9ee"} Sep 30 06:57:50 crc kubenswrapper[4956]: I0930 06:57:50.099625 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5sd5" 
event={"ID":"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0","Type":"ContainerStarted","Data":"e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb"} Sep 30 06:57:50 crc kubenswrapper[4956]: I0930 06:57:50.124494 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z5sd5" podStartSLOduration=2.383646592 podStartE2EDuration="5.124474388s" podCreationTimestamp="2025-09-30 06:57:45 +0000 UTC" firstStartedPulling="2025-09-30 06:57:47.068467592 +0000 UTC m=+5337.395588117" lastFinishedPulling="2025-09-30 06:57:49.809295388 +0000 UTC m=+5340.136415913" observedRunningTime="2025-09-30 06:57:50.121286078 +0000 UTC m=+5340.448406603" watchObservedRunningTime="2025-09-30 06:57:50.124474388 +0000 UTC m=+5340.451594913" Sep 30 06:57:55 crc kubenswrapper[4956]: I0930 06:57:55.369814 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:55 crc kubenswrapper[4956]: I0930 06:57:55.371281 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:56 crc kubenswrapper[4956]: I0930 06:57:56.208408 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:57 crc kubenswrapper[4956]: I0930 06:57:57.221290 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:57 crc kubenswrapper[4956]: I0930 06:57:57.271512 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5sd5"] Sep 30 06:57:59 crc kubenswrapper[4956]: I0930 06:57:59.179034 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z5sd5" podUID="8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" containerName="registry-server" 
containerID="cri-o://e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb" gracePeriod=2 Sep 30 06:57:59 crc kubenswrapper[4956]: I0930 06:57:59.823385 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:57:59 crc kubenswrapper[4956]: I0930 06:57:59.926907 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-catalog-content\") pod \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\" (UID: \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\") " Sep 30 06:57:59 crc kubenswrapper[4956]: I0930 06:57:59.927452 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfhwg\" (UniqueName: \"kubernetes.io/projected/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-kube-api-access-dfhwg\") pod \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\" (UID: \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\") " Sep 30 06:57:59 crc kubenswrapper[4956]: I0930 06:57:59.927492 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-utilities\") pod \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\" (UID: \"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0\") " Sep 30 06:57:59 crc kubenswrapper[4956]: I0930 06:57:59.935174 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-kube-api-access-dfhwg" (OuterVolumeSpecName: "kube-api-access-dfhwg") pod "8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" (UID: "8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0"). InnerVolumeSpecName "kube-api-access-dfhwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:57:59 crc kubenswrapper[4956]: I0930 06:57:59.939690 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-utilities" (OuterVolumeSpecName: "utilities") pod "8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" (UID: "8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:57:59 crc kubenswrapper[4956]: I0930 06:57:59.946716 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" (UID: "8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.030005 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.030043 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfhwg\" (UniqueName: \"kubernetes.io/projected/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-kube-api-access-dfhwg\") on node \"crc\" DevicePath \"\"" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.030058 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.191015 4956 generic.go:334] "Generic (PLEG): container finished" podID="8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" containerID="e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb" exitCode=0 Sep 
30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.191054 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5sd5" event={"ID":"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0","Type":"ContainerDied","Data":"e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb"} Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.191087 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5sd5" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.191107 4956 scope.go:117] "RemoveContainer" containerID="e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.191095 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5sd5" event={"ID":"8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0","Type":"ContainerDied","Data":"6120c73914153352afc2e5a89bfc52499a6d028701bd999f4bb4e3875bb26cea"} Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.222446 4956 scope.go:117] "RemoveContainer" containerID="9d6b6d2ddce281963f75a6c96ebc5d83185cc99b3b5fc11880fdaf0df42eb9ee" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.230814 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5sd5"] Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.244422 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5sd5"] Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.274332 4956 scope.go:117] "RemoveContainer" containerID="ada355b1d30983f7b60f14e87a2bbc3475496ad2c4f25105330086e93cce94a6" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.306858 4956 scope.go:117] "RemoveContainer" containerID="e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb" Sep 30 06:58:00 crc kubenswrapper[4956]: E0930 06:58:00.307306 4956 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb\": container with ID starting with e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb not found: ID does not exist" containerID="e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.307355 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb"} err="failed to get container status \"e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb\": rpc error: code = NotFound desc = could not find container \"e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb\": container with ID starting with e255c50529c99448024982f7a5650fa2cf4c253cd9261c92d8fbabb41d5d0edb not found: ID does not exist" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.307376 4956 scope.go:117] "RemoveContainer" containerID="9d6b6d2ddce281963f75a6c96ebc5d83185cc99b3b5fc11880fdaf0df42eb9ee" Sep 30 06:58:00 crc kubenswrapper[4956]: E0930 06:58:00.307893 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d6b6d2ddce281963f75a6c96ebc5d83185cc99b3b5fc11880fdaf0df42eb9ee\": container with ID starting with 9d6b6d2ddce281963f75a6c96ebc5d83185cc99b3b5fc11880fdaf0df42eb9ee not found: ID does not exist" containerID="9d6b6d2ddce281963f75a6c96ebc5d83185cc99b3b5fc11880fdaf0df42eb9ee" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.307958 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6b6d2ddce281963f75a6c96ebc5d83185cc99b3b5fc11880fdaf0df42eb9ee"} err="failed to get container status \"9d6b6d2ddce281963f75a6c96ebc5d83185cc99b3b5fc11880fdaf0df42eb9ee\": rpc error: code = NotFound desc = could not find container 
\"9d6b6d2ddce281963f75a6c96ebc5d83185cc99b3b5fc11880fdaf0df42eb9ee\": container with ID starting with 9d6b6d2ddce281963f75a6c96ebc5d83185cc99b3b5fc11880fdaf0df42eb9ee not found: ID does not exist" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.307997 4956 scope.go:117] "RemoveContainer" containerID="ada355b1d30983f7b60f14e87a2bbc3475496ad2c4f25105330086e93cce94a6" Sep 30 06:58:00 crc kubenswrapper[4956]: E0930 06:58:00.308403 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ada355b1d30983f7b60f14e87a2bbc3475496ad2c4f25105330086e93cce94a6\": container with ID starting with ada355b1d30983f7b60f14e87a2bbc3475496ad2c4f25105330086e93cce94a6 not found: ID does not exist" containerID="ada355b1d30983f7b60f14e87a2bbc3475496ad2c4f25105330086e93cce94a6" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.308430 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada355b1d30983f7b60f14e87a2bbc3475496ad2c4f25105330086e93cce94a6"} err="failed to get container status \"ada355b1d30983f7b60f14e87a2bbc3475496ad2c4f25105330086e93cce94a6\": rpc error: code = NotFound desc = could not find container \"ada355b1d30983f7b60f14e87a2bbc3475496ad2c4f25105330086e93cce94a6\": container with ID starting with ada355b1d30983f7b60f14e87a2bbc3475496ad2c4f25105330086e93cce94a6 not found: ID does not exist" Sep 30 06:58:00 crc kubenswrapper[4956]: I0930 06:58:00.362408 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" path="/var/lib/kubelet/pods/8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0/volumes" Sep 30 06:58:18 crc kubenswrapper[4956]: I0930 06:58:18.073455 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Sep 30 06:58:18 crc kubenswrapper[4956]: I0930 06:58:18.074089 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:58:48 crc kubenswrapper[4956]: I0930 06:58:48.073315 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:58:48 crc kubenswrapper[4956]: I0930 06:58:48.073804 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:59:13 crc kubenswrapper[4956]: I0930 06:59:13.938585 4956 generic.go:334] "Generic (PLEG): container finished" podID="c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" containerID="7416284bc5173a03ea9d1994287c83158f503af3d81d45dc8efb894751fe25dc" exitCode=0 Sep 30 06:59:13 crc kubenswrapper[4956]: I0930 06:59:13.938676 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69","Type":"ContainerDied","Data":"7416284bc5173a03ea9d1994287c83158f503af3d81d45dc8efb894751fe25dc"} Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.795824 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.960672 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69","Type":"ContainerDied","Data":"220592b4358ee1a1456cf05ae73ff297df2a78f03c6c5a4bb8d0d8b909119ce6"} Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.960714 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="220592b4358ee1a1456cf05ae73ff297df2a78f03c6c5a4bb8d0d8b909119ce6" Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.960740 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.963078 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-config-data\") pod \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.963172 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pltrb\" (UniqueName: \"kubernetes.io/projected/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-kube-api-access-pltrb\") pod \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.963252 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-test-operator-ephemeral-workdir\") pod \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.963313 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-openstack-config\") pod \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.963434 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.963493 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-ca-certs\") pod \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.963517 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-ssh-key\") pod \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.963540 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-openstack-config-secret\") pod \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.963575 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-test-operator-ephemeral-temporary\") pod \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\" (UID: \"c822eb6a-ddf6-44d5-8a3c-35408a3a0f69\") " 
Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.963984 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-config-data" (OuterVolumeSpecName: "config-data") pod "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" (UID: "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.965710 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" (UID: "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.969443 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" (UID: "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.969511 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-kube-api-access-pltrb" (OuterVolumeSpecName: "kube-api-access-pltrb") pod "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" (UID: "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69"). InnerVolumeSpecName "kube-api-access-pltrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.970588 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" (UID: "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.994817 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" (UID: "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:59:15 crc kubenswrapper[4956]: I0930 06:59:15.995431 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" (UID: "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.025354 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" (UID: "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.027994 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" (UID: "c822eb6a-ddf6-44d5-8a3c-35408a3a0f69"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.066146 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.066179 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pltrb\" (UniqueName: \"kubernetes.io/projected/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-kube-api-access-pltrb\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.066192 4956 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.066202 4956 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-openstack-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.066237 4956 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.066255 4956 reconciler_common.go:293] "Volume detached for volume 
\"ca-certs\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-ca-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.066267 4956 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.066276 4956 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.066286 4956 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c822eb6a-ddf6-44d5-8a3c-35408a3a0f69-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.088284 4956 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Sep 30 06:59:16 crc kubenswrapper[4956]: I0930 06:59:16.167514 4956 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:18 crc kubenswrapper[4956]: I0930 06:59:18.073626 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:59:18 crc kubenswrapper[4956]: I0930 06:59:18.075179 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" 
podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:59:18 crc kubenswrapper[4956]: I0930 06:59:18.075306 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 06:59:18 crc kubenswrapper[4956]: I0930 06:59:18.076144 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"402d986c08b22b41e57c748a1f37841a5ee64c5f322e42940d7cbb921db697d7"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:59:18 crc kubenswrapper[4956]: I0930 06:59:18.076283 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://402d986c08b22b41e57c748a1f37841a5ee64c5f322e42940d7cbb921db697d7" gracePeriod=600 Sep 30 06:59:18 crc kubenswrapper[4956]: E0930 06:59:18.257873 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ecd015b_e216_40d8_ae78_711b2a65c193.slice/crio-402d986c08b22b41e57c748a1f37841a5ee64c5f322e42940d7cbb921db697d7.scope\": RecentStats: unable to find data in memory cache]" Sep 30 06:59:18 crc kubenswrapper[4956]: I0930 06:59:18.991479 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="402d986c08b22b41e57c748a1f37841a5ee64c5f322e42940d7cbb921db697d7" exitCode=0 Sep 30 06:59:18 crc kubenswrapper[4956]: I0930 06:59:18.991565 4956 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"402d986c08b22b41e57c748a1f37841a5ee64c5f322e42940d7cbb921db697d7"} Sep 30 06:59:18 crc kubenswrapper[4956]: I0930 06:59:18.991997 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223"} Sep 30 06:59:18 crc kubenswrapper[4956]: I0930 06:59:18.992030 4956 scope.go:117] "RemoveContainer" containerID="c875cb07b3c668921b14f9a3a5d6b3e55dc51fd0ae2463a2de780f3723c46925" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.019593 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 06:59:19 crc kubenswrapper[4956]: E0930 06:59:19.020215 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" containerName="tempest-tests-tempest-tests-runner" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.020239 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" containerName="tempest-tests-tempest-tests-runner" Sep 30 06:59:19 crc kubenswrapper[4956]: E0930 06:59:19.020262 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" containerName="extract-utilities" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.020271 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" containerName="extract-utilities" Sep 30 06:59:19 crc kubenswrapper[4956]: E0930 06:59:19.020289 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" containerName="extract-content" Sep 30 
06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.020298 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" containerName="extract-content" Sep 30 06:59:19 crc kubenswrapper[4956]: E0930 06:59:19.020337 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" containerName="registry-server" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.020345 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" containerName="registry-server" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.020605 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad2b269-99e2-4648-bbb3-9d4fc70e6ef0" containerName="registry-server" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.020635 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c822eb6a-ddf6-44d5-8a3c-35408a3a0f69" containerName="tempest-tests-tempest-tests-runner" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.021508 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.026520 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kdgrq" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.050737 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.132977 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0b31d8d5-0881-4a71-b953-2e70288e3190\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.133046 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgbq4\" (UniqueName: \"kubernetes.io/projected/0b31d8d5-0881-4a71-b953-2e70288e3190-kube-api-access-rgbq4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0b31d8d5-0881-4a71-b953-2e70288e3190\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.236633 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0b31d8d5-0881-4a71-b953-2e70288e3190\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.236728 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgbq4\" (UniqueName: 
\"kubernetes.io/projected/0b31d8d5-0881-4a71-b953-2e70288e3190-kube-api-access-rgbq4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0b31d8d5-0881-4a71-b953-2e70288e3190\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.237271 4956 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0b31d8d5-0881-4a71-b953-2e70288e3190\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.264286 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgbq4\" (UniqueName: \"kubernetes.io/projected/0b31d8d5-0881-4a71-b953-2e70288e3190-kube-api-access-rgbq4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0b31d8d5-0881-4a71-b953-2e70288e3190\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.282560 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0b31d8d5-0881-4a71-b953-2e70288e3190\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.365268 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.804873 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 06:59:19 crc kubenswrapper[4956]: I0930 06:59:19.810401 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:59:20 crc kubenswrapper[4956]: I0930 06:59:20.000919 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0b31d8d5-0881-4a71-b953-2e70288e3190","Type":"ContainerStarted","Data":"dee3a4556174bfd794eeddd25b39091d535f6f851fec52bc3f89b181933f47ff"} Sep 30 06:59:21 crc kubenswrapper[4956]: I0930 06:59:21.017196 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0b31d8d5-0881-4a71-b953-2e70288e3190","Type":"ContainerStarted","Data":"90ce958b63bd6c96fa8898aa8d1ad3d398ddf1f40c70af4df08a8ebe2395d86a"} Sep 30 06:59:21 crc kubenswrapper[4956]: I0930 06:59:21.034161 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.246535951 podStartE2EDuration="3.03414333s" podCreationTimestamp="2025-09-30 06:59:18 +0000 UTC" firstStartedPulling="2025-09-30 06:59:19.810056739 +0000 UTC m=+5430.137177284" lastFinishedPulling="2025-09-30 06:59:20.597664138 +0000 UTC m=+5430.924784663" observedRunningTime="2025-09-30 06:59:21.032907272 +0000 UTC m=+5431.360027797" watchObservedRunningTime="2025-09-30 06:59:21.03414333 +0000 UTC m=+5431.361263855" Sep 30 06:59:38 crc kubenswrapper[4956]: I0930 06:59:38.686625 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-24gnb/must-gather-lvqn6"] Sep 30 06:59:38 crc kubenswrapper[4956]: I0930 
06:59:38.689932 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24gnb/must-gather-lvqn6" Sep 30 06:59:38 crc kubenswrapper[4956]: I0930 06:59:38.700249 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-24gnb"/"kube-root-ca.crt" Sep 30 06:59:38 crc kubenswrapper[4956]: I0930 06:59:38.700304 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-24gnb"/"openshift-service-ca.crt" Sep 30 06:59:38 crc kubenswrapper[4956]: I0930 06:59:38.720290 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-24gnb/must-gather-lvqn6"] Sep 30 06:59:38 crc kubenswrapper[4956]: I0930 06:59:38.738555 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjzfn\" (UniqueName: \"kubernetes.io/projected/597bee6b-c39b-402a-bdce-6d8ac7243b62-kube-api-access-qjzfn\") pod \"must-gather-lvqn6\" (UID: \"597bee6b-c39b-402a-bdce-6d8ac7243b62\") " pod="openshift-must-gather-24gnb/must-gather-lvqn6" Sep 30 06:59:38 crc kubenswrapper[4956]: I0930 06:59:38.738817 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/597bee6b-c39b-402a-bdce-6d8ac7243b62-must-gather-output\") pod \"must-gather-lvqn6\" (UID: \"597bee6b-c39b-402a-bdce-6d8ac7243b62\") " pod="openshift-must-gather-24gnb/must-gather-lvqn6" Sep 30 06:59:38 crc kubenswrapper[4956]: I0930 06:59:38.840516 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjzfn\" (UniqueName: \"kubernetes.io/projected/597bee6b-c39b-402a-bdce-6d8ac7243b62-kube-api-access-qjzfn\") pod \"must-gather-lvqn6\" (UID: \"597bee6b-c39b-402a-bdce-6d8ac7243b62\") " pod="openshift-must-gather-24gnb/must-gather-lvqn6" Sep 30 06:59:38 crc kubenswrapper[4956]: I0930 06:59:38.840569 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/597bee6b-c39b-402a-bdce-6d8ac7243b62-must-gather-output\") pod \"must-gather-lvqn6\" (UID: \"597bee6b-c39b-402a-bdce-6d8ac7243b62\") " pod="openshift-must-gather-24gnb/must-gather-lvqn6" Sep 30 06:59:38 crc kubenswrapper[4956]: I0930 06:59:38.841098 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/597bee6b-c39b-402a-bdce-6d8ac7243b62-must-gather-output\") pod \"must-gather-lvqn6\" (UID: \"597bee6b-c39b-402a-bdce-6d8ac7243b62\") " pod="openshift-must-gather-24gnb/must-gather-lvqn6" Sep 30 06:59:38 crc kubenswrapper[4956]: I0930 06:59:38.861282 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjzfn\" (UniqueName: \"kubernetes.io/projected/597bee6b-c39b-402a-bdce-6d8ac7243b62-kube-api-access-qjzfn\") pod \"must-gather-lvqn6\" (UID: \"597bee6b-c39b-402a-bdce-6d8ac7243b62\") " pod="openshift-must-gather-24gnb/must-gather-lvqn6" Sep 30 06:59:39 crc kubenswrapper[4956]: I0930 06:59:39.028950 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24gnb/must-gather-lvqn6" Sep 30 06:59:39 crc kubenswrapper[4956]: I0930 06:59:39.564946 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-24gnb/must-gather-lvqn6"] Sep 30 06:59:39 crc kubenswrapper[4956]: W0930 06:59:39.577432 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod597bee6b_c39b_402a_bdce_6d8ac7243b62.slice/crio-573c0ce1bada751bfde6fec7bc381717c260a72a873ce8389966c2c64da59854 WatchSource:0}: Error finding container 573c0ce1bada751bfde6fec7bc381717c260a72a873ce8389966c2c64da59854: Status 404 returned error can't find the container with id 573c0ce1bada751bfde6fec7bc381717c260a72a873ce8389966c2c64da59854 Sep 30 06:59:40 crc kubenswrapper[4956]: I0930 06:59:40.211909 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24gnb/must-gather-lvqn6" event={"ID":"597bee6b-c39b-402a-bdce-6d8ac7243b62","Type":"ContainerStarted","Data":"573c0ce1bada751bfde6fec7bc381717c260a72a873ce8389966c2c64da59854"} Sep 30 06:59:47 crc kubenswrapper[4956]: I0930 06:59:47.283046 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24gnb/must-gather-lvqn6" event={"ID":"597bee6b-c39b-402a-bdce-6d8ac7243b62","Type":"ContainerStarted","Data":"89f2b471941e4cb6939a5ff07ac14e8c107cd5a9d38ad4e56afb06e65a956f6a"} Sep 30 06:59:48 crc kubenswrapper[4956]: I0930 06:59:48.293589 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24gnb/must-gather-lvqn6" event={"ID":"597bee6b-c39b-402a-bdce-6d8ac7243b62","Type":"ContainerStarted","Data":"9c5d5a2b0292cc93f8d7d891e62a97ac199ace2d19a3d4ecccb50601318f4159"} Sep 30 06:59:48 crc kubenswrapper[4956]: I0930 06:59:48.315543 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-24gnb/must-gather-lvqn6" podStartSLOduration=2.982154723 
podStartE2EDuration="10.315523498s" podCreationTimestamp="2025-09-30 06:59:38 +0000 UTC" firstStartedPulling="2025-09-30 06:59:39.580276339 +0000 UTC m=+5449.907396864" lastFinishedPulling="2025-09-30 06:59:46.913645114 +0000 UTC m=+5457.240765639" observedRunningTime="2025-09-30 06:59:48.30567392 +0000 UTC m=+5458.632794465" watchObservedRunningTime="2025-09-30 06:59:48.315523498 +0000 UTC m=+5458.642644023" Sep 30 06:59:51 crc kubenswrapper[4956]: I0930 06:59:51.630872 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-24gnb/crc-debug-h8655"] Sep 30 06:59:51 crc kubenswrapper[4956]: I0930 06:59:51.632767 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24gnb/crc-debug-h8655" Sep 30 06:59:51 crc kubenswrapper[4956]: I0930 06:59:51.635071 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-24gnb"/"default-dockercfg-4zng4" Sep 30 06:59:51 crc kubenswrapper[4956]: I0930 06:59:51.754639 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4e46d8f-8f7c-4521-8b88-879aa4a93236-host\") pod \"crc-debug-h8655\" (UID: \"e4e46d8f-8f7c-4521-8b88-879aa4a93236\") " pod="openshift-must-gather-24gnb/crc-debug-h8655" Sep 30 06:59:51 crc kubenswrapper[4956]: I0930 06:59:51.754694 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct7dx\" (UniqueName: \"kubernetes.io/projected/e4e46d8f-8f7c-4521-8b88-879aa4a93236-kube-api-access-ct7dx\") pod \"crc-debug-h8655\" (UID: \"e4e46d8f-8f7c-4521-8b88-879aa4a93236\") " pod="openshift-must-gather-24gnb/crc-debug-h8655" Sep 30 06:59:51 crc kubenswrapper[4956]: I0930 06:59:51.856760 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4e46d8f-8f7c-4521-8b88-879aa4a93236-host\") pod 
\"crc-debug-h8655\" (UID: \"e4e46d8f-8f7c-4521-8b88-879aa4a93236\") " pod="openshift-must-gather-24gnb/crc-debug-h8655" Sep 30 06:59:51 crc kubenswrapper[4956]: I0930 06:59:51.856835 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct7dx\" (UniqueName: \"kubernetes.io/projected/e4e46d8f-8f7c-4521-8b88-879aa4a93236-kube-api-access-ct7dx\") pod \"crc-debug-h8655\" (UID: \"e4e46d8f-8f7c-4521-8b88-879aa4a93236\") " pod="openshift-must-gather-24gnb/crc-debug-h8655" Sep 30 06:59:51 crc kubenswrapper[4956]: I0930 06:59:51.856885 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4e46d8f-8f7c-4521-8b88-879aa4a93236-host\") pod \"crc-debug-h8655\" (UID: \"e4e46d8f-8f7c-4521-8b88-879aa4a93236\") " pod="openshift-must-gather-24gnb/crc-debug-h8655" Sep 30 06:59:51 crc kubenswrapper[4956]: I0930 06:59:51.879695 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct7dx\" (UniqueName: \"kubernetes.io/projected/e4e46d8f-8f7c-4521-8b88-879aa4a93236-kube-api-access-ct7dx\") pod \"crc-debug-h8655\" (UID: \"e4e46d8f-8f7c-4521-8b88-879aa4a93236\") " pod="openshift-must-gather-24gnb/crc-debug-h8655" Sep 30 06:59:51 crc kubenswrapper[4956]: I0930 06:59:51.952743 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24gnb/crc-debug-h8655" Sep 30 06:59:51 crc kubenswrapper[4956]: W0930 06:59:51.984974 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4e46d8f_8f7c_4521_8b88_879aa4a93236.slice/crio-e9068f22140c259aeddec318e0ecbbff33e7aee7ee43602c050cef3670b7ddd1 WatchSource:0}: Error finding container e9068f22140c259aeddec318e0ecbbff33e7aee7ee43602c050cef3670b7ddd1: Status 404 returned error can't find the container with id e9068f22140c259aeddec318e0ecbbff33e7aee7ee43602c050cef3670b7ddd1 Sep 30 06:59:52 crc kubenswrapper[4956]: I0930 06:59:52.379467 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24gnb/crc-debug-h8655" event={"ID":"e4e46d8f-8f7c-4521-8b88-879aa4a93236","Type":"ContainerStarted","Data":"e9068f22140c259aeddec318e0ecbbff33e7aee7ee43602c050cef3670b7ddd1"} Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.148446 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c"] Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.150838 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.154772 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.155518 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.163847 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c"] Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.237665 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzqbs\" (UniqueName: \"kubernetes.io/projected/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-kube-api-access-jzqbs\") pod \"collect-profiles-29320260-xk86c\" (UID: \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.237966 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-secret-volume\") pod \"collect-profiles-29320260-xk86c\" (UID: \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.238210 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-config-volume\") pod \"collect-profiles-29320260-xk86c\" (UID: \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.341345 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-config-volume\") pod \"collect-profiles-29320260-xk86c\" (UID: \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.341534 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzqbs\" (UniqueName: \"kubernetes.io/projected/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-kube-api-access-jzqbs\") pod \"collect-profiles-29320260-xk86c\" (UID: \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.341631 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-secret-volume\") pod \"collect-profiles-29320260-xk86c\" (UID: \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.342806 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-config-volume\") pod \"collect-profiles-29320260-xk86c\" (UID: \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.356190 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-secret-volume\") pod \"collect-profiles-29320260-xk86c\" (UID: \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.364178 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzqbs\" (UniqueName: \"kubernetes.io/projected/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-kube-api-access-jzqbs\") pod \"collect-profiles-29320260-xk86c\" (UID: \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:00 crc kubenswrapper[4956]: I0930 07:00:00.479072 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:04 crc kubenswrapper[4956]: I0930 07:00:04.707483 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c"] Sep 30 07:00:06 crc kubenswrapper[4956]: I0930 07:00:06.539547 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" event={"ID":"eaa6ec4b-0657-408e-a609-86de0cb4f4dc","Type":"ContainerStarted","Data":"afa8e50372c5df9a5a435526495f2dd705bd7a677a40a4315a7f751b9af11f39"} Sep 30 07:00:09 crc kubenswrapper[4956]: I0930 07:00:09.588371 4956 generic.go:334] "Generic (PLEG): container finished" podID="eaa6ec4b-0657-408e-a609-86de0cb4f4dc" containerID="3cab5dacc93304dde59e6837f1adddac148525a077d578d35c530db516893c28" exitCode=0 Sep 30 07:00:09 crc kubenswrapper[4956]: I0930 07:00:09.588816 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" 
event={"ID":"eaa6ec4b-0657-408e-a609-86de0cb4f4dc","Type":"ContainerDied","Data":"3cab5dacc93304dde59e6837f1adddac148525a077d578d35c530db516893c28"} Sep 30 07:00:09 crc kubenswrapper[4956]: I0930 07:00:09.591575 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24gnb/crc-debug-h8655" event={"ID":"e4e46d8f-8f7c-4521-8b88-879aa4a93236","Type":"ContainerStarted","Data":"78359b6f6108a72e384d82f51054651ca3b51a3fe73dbcdb16d205622ff3fad3"} Sep 30 07:00:09 crc kubenswrapper[4956]: I0930 07:00:09.670725 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-24gnb/crc-debug-h8655" podStartSLOduration=3.945692161 podStartE2EDuration="18.67068361s" podCreationTimestamp="2025-09-30 06:59:51 +0000 UTC" firstStartedPulling="2025-09-30 06:59:51.98807426 +0000 UTC m=+5462.315194785" lastFinishedPulling="2025-09-30 07:00:06.713065699 +0000 UTC m=+5477.040186234" observedRunningTime="2025-09-30 07:00:09.655297108 +0000 UTC m=+5479.982417633" watchObservedRunningTime="2025-09-30 07:00:09.67068361 +0000 UTC m=+5479.997804135" Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.056923 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.139997 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-secret-volume\") pod \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\" (UID: \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\") " Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.140264 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzqbs\" (UniqueName: \"kubernetes.io/projected/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-kube-api-access-jzqbs\") pod \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\" (UID: \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\") " Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.140463 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-config-volume\") pod \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\" (UID: \"eaa6ec4b-0657-408e-a609-86de0cb4f4dc\") " Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.141456 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-config-volume" (OuterVolumeSpecName: "config-volume") pod "eaa6ec4b-0657-408e-a609-86de0cb4f4dc" (UID: "eaa6ec4b-0657-408e-a609-86de0cb4f4dc"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.141805 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.148389 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eaa6ec4b-0657-408e-a609-86de0cb4f4dc" (UID: "eaa6ec4b-0657-408e-a609-86de0cb4f4dc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.154364 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-kube-api-access-jzqbs" (OuterVolumeSpecName: "kube-api-access-jzqbs") pod "eaa6ec4b-0657-408e-a609-86de0cb4f4dc" (UID: "eaa6ec4b-0657-408e-a609-86de0cb4f4dc"). InnerVolumeSpecName "kube-api-access-jzqbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.243753 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.243824 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzqbs\" (UniqueName: \"kubernetes.io/projected/eaa6ec4b-0657-408e-a609-86de0cb4f4dc-kube-api-access-jzqbs\") on node \"crc\" DevicePath \"\"" Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.612842 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" event={"ID":"eaa6ec4b-0657-408e-a609-86de0cb4f4dc","Type":"ContainerDied","Data":"afa8e50372c5df9a5a435526495f2dd705bd7a677a40a4315a7f751b9af11f39"} Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.613287 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afa8e50372c5df9a5a435526495f2dd705bd7a677a40a4315a7f751b9af11f39" Sep 30 07:00:11 crc kubenswrapper[4956]: I0930 07:00:11.613190 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-xk86c" Sep 30 07:00:12 crc kubenswrapper[4956]: I0930 07:00:12.157352 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj"] Sep 30 07:00:12 crc kubenswrapper[4956]: I0930 07:00:12.167201 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320215-6z7zj"] Sep 30 07:00:12 crc kubenswrapper[4956]: I0930 07:00:12.361405 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d539a2-c896-4f86-a2ba-6609af83673b" path="/var/lib/kubelet/pods/92d539a2-c896-4f86-a2ba-6609af83673b/volumes" Sep 30 07:00:15 crc kubenswrapper[4956]: I0930 07:00:15.948086 4956 scope.go:117] "RemoveContainer" containerID="4c185e26652bc5538cd47def6201cab4721e3feffd31072266428d96db08c111" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.145385 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p7vbz"] Sep 30 07:00:47 crc kubenswrapper[4956]: E0930 07:00:47.146475 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa6ec4b-0657-408e-a609-86de0cb4f4dc" containerName="collect-profiles" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.146488 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa6ec4b-0657-408e-a609-86de0cb4f4dc" containerName="collect-profiles" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.146681 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa6ec4b-0657-408e-a609-86de0cb4f4dc" containerName="collect-profiles" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.148364 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.158510 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p7vbz"] Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.194812 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f53e2103-05b5-42b9-9ca7-014f5407c290-utilities\") pod \"redhat-operators-p7vbz\" (UID: \"f53e2103-05b5-42b9-9ca7-014f5407c290\") " pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.194891 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f53e2103-05b5-42b9-9ca7-014f5407c290-catalog-content\") pod \"redhat-operators-p7vbz\" (UID: \"f53e2103-05b5-42b9-9ca7-014f5407c290\") " pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.195044 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddfh4\" (UniqueName: \"kubernetes.io/projected/f53e2103-05b5-42b9-9ca7-014f5407c290-kube-api-access-ddfh4\") pod \"redhat-operators-p7vbz\" (UID: \"f53e2103-05b5-42b9-9ca7-014f5407c290\") " pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.297802 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f53e2103-05b5-42b9-9ca7-014f5407c290-utilities\") pod \"redhat-operators-p7vbz\" (UID: \"f53e2103-05b5-42b9-9ca7-014f5407c290\") " pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.298491 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f53e2103-05b5-42b9-9ca7-014f5407c290-catalog-content\") pod \"redhat-operators-p7vbz\" (UID: \"f53e2103-05b5-42b9-9ca7-014f5407c290\") " pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.298627 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddfh4\" (UniqueName: \"kubernetes.io/projected/f53e2103-05b5-42b9-9ca7-014f5407c290-kube-api-access-ddfh4\") pod \"redhat-operators-p7vbz\" (UID: \"f53e2103-05b5-42b9-9ca7-014f5407c290\") " pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.299058 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f53e2103-05b5-42b9-9ca7-014f5407c290-catalog-content\") pod \"redhat-operators-p7vbz\" (UID: \"f53e2103-05b5-42b9-9ca7-014f5407c290\") " pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.299278 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f53e2103-05b5-42b9-9ca7-014f5407c290-utilities\") pod \"redhat-operators-p7vbz\" (UID: \"f53e2103-05b5-42b9-9ca7-014f5407c290\") " pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.335456 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddfh4\" (UniqueName: \"kubernetes.io/projected/f53e2103-05b5-42b9-9ca7-014f5407c290-kube-api-access-ddfh4\") pod \"redhat-operators-p7vbz\" (UID: \"f53e2103-05b5-42b9-9ca7-014f5407c290\") " pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:00:47 crc kubenswrapper[4956]: I0930 07:00:47.473650 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:00:48 crc kubenswrapper[4956]: I0930 07:00:48.359732 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p7vbz"] Sep 30 07:00:49 crc kubenswrapper[4956]: I0930 07:00:49.000914 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7vbz" event={"ID":"f53e2103-05b5-42b9-9ca7-014f5407c290","Type":"ContainerStarted","Data":"4b65d15966394e9fef0b34ee368a18d94ab6e2ca38079759a29f068aaa0924e3"} Sep 30 07:00:49 crc kubenswrapper[4956]: I0930 07:00:49.001279 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7vbz" event={"ID":"f53e2103-05b5-42b9-9ca7-014f5407c290","Type":"ContainerStarted","Data":"83ba32e766486859abd4237c96ae727aa72be72b8778c4388325ad59a6c9bcd3"} Sep 30 07:00:50 crc kubenswrapper[4956]: I0930 07:00:50.011802 4956 generic.go:334] "Generic (PLEG): container finished" podID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerID="4b65d15966394e9fef0b34ee368a18d94ab6e2ca38079759a29f068aaa0924e3" exitCode=0 Sep 30 07:00:50 crc kubenswrapper[4956]: I0930 07:00:50.012387 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7vbz" event={"ID":"f53e2103-05b5-42b9-9ca7-014f5407c290","Type":"ContainerDied","Data":"4b65d15966394e9fef0b34ee368a18d94ab6e2ca38079759a29f068aaa0924e3"} Sep 30 07:00:54 crc kubenswrapper[4956]: I0930 07:00:54.066474 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7vbz" event={"ID":"f53e2103-05b5-42b9-9ca7-014f5407c290","Type":"ContainerStarted","Data":"654ebede850381428bb5e06b7b61c2d42c13593da9d8d489bc04db4a71f7dfef"} Sep 30 07:00:59 crc kubenswrapper[4956]: I0930 07:00:59.108475 4956 generic.go:334] "Generic (PLEG): container finished" podID="f53e2103-05b5-42b9-9ca7-014f5407c290" 
containerID="654ebede850381428bb5e06b7b61c2d42c13593da9d8d489bc04db4a71f7dfef" exitCode=0 Sep 30 07:00:59 crc kubenswrapper[4956]: I0930 07:00:59.109180 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7vbz" event={"ID":"f53e2103-05b5-42b9-9ca7-014f5407c290","Type":"ContainerDied","Data":"654ebede850381428bb5e06b7b61c2d42c13593da9d8d489bc04db4a71f7dfef"} Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.150358 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320261-p55tp"] Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.155807 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.166527 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320261-p55tp"] Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.260626 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-fernet-keys\") pod \"keystone-cron-29320261-p55tp\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.260687 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-config-data\") pod \"keystone-cron-29320261-p55tp\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.260797 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24bgl\" (UniqueName: 
\"kubernetes.io/projected/f9787291-2195-4229-910e-88f41de812c8-kube-api-access-24bgl\") pod \"keystone-cron-29320261-p55tp\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.260835 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-combined-ca-bundle\") pod \"keystone-cron-29320261-p55tp\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.361995 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-fernet-keys\") pod \"keystone-cron-29320261-p55tp\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.362043 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-config-data\") pod \"keystone-cron-29320261-p55tp\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.362107 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24bgl\" (UniqueName: \"kubernetes.io/projected/f9787291-2195-4229-910e-88f41de812c8-kube-api-access-24bgl\") pod \"keystone-cron-29320261-p55tp\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.363164 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-combined-ca-bundle\") pod \"keystone-cron-29320261-p55tp\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.372004 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-fernet-keys\") pod \"keystone-cron-29320261-p55tp\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.373313 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-combined-ca-bundle\") pod \"keystone-cron-29320261-p55tp\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.374174 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-config-data\") pod \"keystone-cron-29320261-p55tp\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.380941 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24bgl\" (UniqueName: \"kubernetes.io/projected/f9787291-2195-4229-910e-88f41de812c8-kube-api-access-24bgl\") pod \"keystone-cron-29320261-p55tp\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:00 crc kubenswrapper[4956]: I0930 07:01:00.493544 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:01 crc kubenswrapper[4956]: I0930 07:01:01.020386 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320261-p55tp"] Sep 30 07:01:02 crc kubenswrapper[4956]: I0930 07:01:02.133327 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320261-p55tp" event={"ID":"f9787291-2195-4229-910e-88f41de812c8","Type":"ContainerStarted","Data":"3e3409830150c52e2b5d5c6c9ae305906775a7a7a48058eac826ee4629f082ec"} Sep 30 07:01:04 crc kubenswrapper[4956]: I0930 07:01:04.156439 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320261-p55tp" event={"ID":"f9787291-2195-4229-910e-88f41de812c8","Type":"ContainerStarted","Data":"ac6ae3d52c53dd763639773c6c8a6f67fc1f8972e204660416ad755c1107aa52"} Sep 30 07:01:05 crc kubenswrapper[4956]: I0930 07:01:05.201669 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320261-p55tp" podStartSLOduration=5.201650681 podStartE2EDuration="5.201650681s" podCreationTimestamp="2025-09-30 07:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:01:05.195455926 +0000 UTC m=+5535.522576451" watchObservedRunningTime="2025-09-30 07:01:05.201650681 +0000 UTC m=+5535.528771216" Sep 30 07:01:06 crc kubenswrapper[4956]: I0930 07:01:06.186665 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7vbz" event={"ID":"f53e2103-05b5-42b9-9ca7-014f5407c290","Type":"ContainerStarted","Data":"3e589ac469c6548acaecef1b6bd952ff486e4f9ead7172ecc6b212033181878c"} Sep 30 07:01:06 crc kubenswrapper[4956]: I0930 07:01:06.211140 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p7vbz" podStartSLOduration=3.752586908 
podStartE2EDuration="19.211107934s" podCreationTimestamp="2025-09-30 07:00:47 +0000 UTC" firstStartedPulling="2025-09-30 07:00:50.016175914 +0000 UTC m=+5520.343296439" lastFinishedPulling="2025-09-30 07:01:05.47469695 +0000 UTC m=+5535.801817465" observedRunningTime="2025-09-30 07:01:06.204211218 +0000 UTC m=+5536.531331743" watchObservedRunningTime="2025-09-30 07:01:06.211107934 +0000 UTC m=+5536.538228459" Sep 30 07:01:07 crc kubenswrapper[4956]: I0930 07:01:07.474828 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:01:07 crc kubenswrapper[4956]: I0930 07:01:07.475196 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:01:08 crc kubenswrapper[4956]: I0930 07:01:08.526364 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p7vbz" podUID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerName="registry-server" probeResult="failure" output=< Sep 30 07:01:08 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Sep 30 07:01:08 crc kubenswrapper[4956]: > Sep 30 07:01:11 crc kubenswrapper[4956]: I0930 07:01:11.235236 4956 generic.go:334] "Generic (PLEG): container finished" podID="f9787291-2195-4229-910e-88f41de812c8" containerID="ac6ae3d52c53dd763639773c6c8a6f67fc1f8972e204660416ad755c1107aa52" exitCode=0 Sep 30 07:01:11 crc kubenswrapper[4956]: I0930 07:01:11.235329 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320261-p55tp" event={"ID":"f9787291-2195-4229-910e-88f41de812c8","Type":"ContainerDied","Data":"ac6ae3d52c53dd763639773c6c8a6f67fc1f8972e204660416ad755c1107aa52"} Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.644512 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.769696 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-fernet-keys\") pod \"f9787291-2195-4229-910e-88f41de812c8\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.769785 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-combined-ca-bundle\") pod \"f9787291-2195-4229-910e-88f41de812c8\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.769864 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-config-data\") pod \"f9787291-2195-4229-910e-88f41de812c8\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.769985 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24bgl\" (UniqueName: \"kubernetes.io/projected/f9787291-2195-4229-910e-88f41de812c8-kube-api-access-24bgl\") pod \"f9787291-2195-4229-910e-88f41de812c8\" (UID: \"f9787291-2195-4229-910e-88f41de812c8\") " Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.778025 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f9787291-2195-4229-910e-88f41de812c8" (UID: "f9787291-2195-4229-910e-88f41de812c8"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.800793 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9787291-2195-4229-910e-88f41de812c8-kube-api-access-24bgl" (OuterVolumeSpecName: "kube-api-access-24bgl") pod "f9787291-2195-4229-910e-88f41de812c8" (UID: "f9787291-2195-4229-910e-88f41de812c8"). InnerVolumeSpecName "kube-api-access-24bgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.807003 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9787291-2195-4229-910e-88f41de812c8" (UID: "f9787291-2195-4229-910e-88f41de812c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.836363 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-config-data" (OuterVolumeSpecName: "config-data") pod "f9787291-2195-4229-910e-88f41de812c8" (UID: "f9787291-2195-4229-910e-88f41de812c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.872763 4956 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.872797 4956 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.872811 4956 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9787291-2195-4229-910e-88f41de812c8-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:12 crc kubenswrapper[4956]: I0930 07:01:12.872824 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24bgl\" (UniqueName: \"kubernetes.io/projected/f9787291-2195-4229-910e-88f41de812c8-kube-api-access-24bgl\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:13 crc kubenswrapper[4956]: I0930 07:01:13.255979 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320261-p55tp" event={"ID":"f9787291-2195-4229-910e-88f41de812c8","Type":"ContainerDied","Data":"3e3409830150c52e2b5d5c6c9ae305906775a7a7a48058eac826ee4629f082ec"} Sep 30 07:01:13 crc kubenswrapper[4956]: I0930 07:01:13.256035 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320261-p55tp" Sep 30 07:01:13 crc kubenswrapper[4956]: I0930 07:01:13.256054 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e3409830150c52e2b5d5c6c9ae305906775a7a7a48058eac826ee4629f082ec" Sep 30 07:01:18 crc kubenswrapper[4956]: I0930 07:01:18.073268 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:01:18 crc kubenswrapper[4956]: I0930 07:01:18.074085 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:01:18 crc kubenswrapper[4956]: I0930 07:01:18.614242 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p7vbz" podUID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerName="registry-server" probeResult="failure" output=< Sep 30 07:01:18 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Sep 30 07:01:18 crc kubenswrapper[4956]: > Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.473695 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-646f99fb9d-9lthp_45e3d2ce-91f3-420a-b8a2-ebeb4c113565/barbican-api/0.log" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.514177 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-msj55"] Sep 30 07:01:24 crc kubenswrapper[4956]: E0930 07:01:24.514791 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f9787291-2195-4229-910e-88f41de812c8" containerName="keystone-cron" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.514808 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9787291-2195-4229-910e-88f41de812c8" containerName="keystone-cron" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.515091 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9787291-2195-4229-910e-88f41de812c8" containerName="keystone-cron" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.516735 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.527303 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-msj55"] Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.589490 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-646f99fb9d-9lthp_45e3d2ce-91f3-420a-b8a2-ebeb4c113565/barbican-api-log/0.log" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.642783 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb8a8c1-947f-405d-b390-b4248b390580-utilities\") pod \"community-operators-msj55\" (UID: \"7fb8a8c1-947f-405d-b390-b4248b390580\") " pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.642976 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb8a8c1-947f-405d-b390-b4248b390580-catalog-content\") pod \"community-operators-msj55\" (UID: \"7fb8a8c1-947f-405d-b390-b4248b390580\") " pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.643074 4956 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44bpk\" (UniqueName: \"kubernetes.io/projected/7fb8a8c1-947f-405d-b390-b4248b390580-kube-api-access-44bpk\") pod \"community-operators-msj55\" (UID: \"7fb8a8c1-947f-405d-b390-b4248b390580\") " pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.748514 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb8a8c1-947f-405d-b390-b4248b390580-catalog-content\") pod \"community-operators-msj55\" (UID: \"7fb8a8c1-947f-405d-b390-b4248b390580\") " pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.748595 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44bpk\" (UniqueName: \"kubernetes.io/projected/7fb8a8c1-947f-405d-b390-b4248b390580-kube-api-access-44bpk\") pod \"community-operators-msj55\" (UID: \"7fb8a8c1-947f-405d-b390-b4248b390580\") " pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.748712 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb8a8c1-947f-405d-b390-b4248b390580-utilities\") pod \"community-operators-msj55\" (UID: \"7fb8a8c1-947f-405d-b390-b4248b390580\") " pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.749306 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb8a8c1-947f-405d-b390-b4248b390580-utilities\") pod \"community-operators-msj55\" (UID: \"7fb8a8c1-947f-405d-b390-b4248b390580\") " pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.749354 4956 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb8a8c1-947f-405d-b390-b4248b390580-catalog-content\") pod \"community-operators-msj55\" (UID: \"7fb8a8c1-947f-405d-b390-b4248b390580\") " pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.753878 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-96cd79c6-2b9nr_bc0eca45-f776-4d77-8589-a2605f824696/barbican-keystone-listener/0.log" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.777028 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44bpk\" (UniqueName: \"kubernetes.io/projected/7fb8a8c1-947f-405d-b390-b4248b390580-kube-api-access-44bpk\") pod \"community-operators-msj55\" (UID: \"7fb8a8c1-947f-405d-b390-b4248b390580\") " pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.849956 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:24 crc kubenswrapper[4956]: I0930 07:01:24.989007 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-96cd79c6-2b9nr_bc0eca45-f776-4d77-8589-a2605f824696/barbican-keystone-listener-log/0.log" Sep 30 07:01:25 crc kubenswrapper[4956]: I0930 07:01:25.348175 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-msj55"] Sep 30 07:01:25 crc kubenswrapper[4956]: W0930 07:01:25.399326 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fb8a8c1_947f_405d_b390_b4248b390580.slice/crio-0196b31d0f3c0c011ad051a4f7be53b16293153514963bf5d556ee8031bfcb2f WatchSource:0}: Error finding container 0196b31d0f3c0c011ad051a4f7be53b16293153514963bf5d556ee8031bfcb2f: Status 404 returned error can't find the container with id 0196b31d0f3c0c011ad051a4f7be53b16293153514963bf5d556ee8031bfcb2f Sep 30 07:01:25 crc kubenswrapper[4956]: I0930 07:01:25.502242 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56bb449d8f-bh9xp_6c9006da-cf02-4bed-8247-02b6e929ff98/barbican-worker/0.log" Sep 30 07:01:25 crc kubenswrapper[4956]: I0930 07:01:25.591051 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56bb449d8f-bh9xp_6c9006da-cf02-4bed-8247-02b6e929ff98/barbican-worker-log/0.log" Sep 30 07:01:25 crc kubenswrapper[4956]: I0930 07:01:25.779270 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv_dc55cabb-5e01-4012-a02f-27ee023df0c4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:26 crc kubenswrapper[4956]: I0930 07:01:26.157882 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_c3e1cc75-7672-44f7-acb3-69ef2ae910d1/ceilometer-notification-agent/0.log" Sep 30 07:01:26 crc kubenswrapper[4956]: I0930 07:01:26.157965 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c3e1cc75-7672-44f7-acb3-69ef2ae910d1/proxy-httpd/0.log" Sep 30 07:01:26 crc kubenswrapper[4956]: I0930 07:01:26.202389 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c3e1cc75-7672-44f7-acb3-69ef2ae910d1/ceilometer-central-agent/0.log" Sep 30 07:01:26 crc kubenswrapper[4956]: I0930 07:01:26.393301 4956 generic.go:334] "Generic (PLEG): container finished" podID="7fb8a8c1-947f-405d-b390-b4248b390580" containerID="91615f64c2aacab22e2a2e90c91faa07117dfa2f92fe4aa0716cbd1aaa68f889" exitCode=0 Sep 30 07:01:26 crc kubenswrapper[4956]: I0930 07:01:26.393340 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msj55" event={"ID":"7fb8a8c1-947f-405d-b390-b4248b390580","Type":"ContainerDied","Data":"91615f64c2aacab22e2a2e90c91faa07117dfa2f92fe4aa0716cbd1aaa68f889"} Sep 30 07:01:26 crc kubenswrapper[4956]: I0930 07:01:26.393366 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msj55" event={"ID":"7fb8a8c1-947f-405d-b390-b4248b390580","Type":"ContainerStarted","Data":"0196b31d0f3c0c011ad051a4f7be53b16293153514963bf5d556ee8031bfcb2f"} Sep 30 07:01:26 crc kubenswrapper[4956]: I0930 07:01:26.416408 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c3e1cc75-7672-44f7-acb3-69ef2ae910d1/sg-core/0.log" Sep 30 07:01:26 crc kubenswrapper[4956]: I0930 07:01:26.720087 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3b0e586a-4f48-4d87-9ecb-732f5723e089/cinder-api/0.log" Sep 30 07:01:26 crc kubenswrapper[4956]: I0930 07:01:26.803165 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_3b0e586a-4f48-4d87-9ecb-732f5723e089/cinder-api-log/0.log" Sep 30 07:01:26 crc kubenswrapper[4956]: I0930 07:01:26.969832 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6a945d1b-fce5-4069-8fb9-6c483a712cd2/cinder-scheduler/0.log" Sep 30 07:01:27 crc kubenswrapper[4956]: I0930 07:01:27.104658 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6a945d1b-fce5-4069-8fb9-6c483a712cd2/probe/0.log" Sep 30 07:01:27 crc kubenswrapper[4956]: I0930 07:01:27.976711 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm_7577cb07-1bcb-4432-a6e8-f57c1f1b2421/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:27 crc kubenswrapper[4956]: I0930 07:01:27.979901 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d_ec470ec9-3b1f-409d-ab54-44bb44daf1fe/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:28 crc kubenswrapper[4956]: I0930 07:01:28.093928 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-dnptc_c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee/init/0.log" Sep 30 07:01:28 crc kubenswrapper[4956]: I0930 07:01:28.338765 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-dnptc_c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee/init/0.log" Sep 30 07:01:28 crc kubenswrapper[4956]: I0930 07:01:28.390201 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6_6f2a0586-6940-498b-8eaa-1f16bf0ea2fa/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:28 crc kubenswrapper[4956]: I0930 07:01:28.572149 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-dnptc_c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee/dnsmasq-dns/0.log" Sep 30 07:01:28 crc kubenswrapper[4956]: I0930 07:01:28.692345 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a834394a-7f87-43b1-aebb-e61b5916077c/glance-httpd/0.log" Sep 30 07:01:28 crc kubenswrapper[4956]: I0930 07:01:28.719410 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a834394a-7f87-43b1-aebb-e61b5916077c/glance-log/0.log" Sep 30 07:01:28 crc kubenswrapper[4956]: I0930 07:01:28.723647 4956 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p7vbz" podUID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerName="registry-server" probeResult="failure" output=< Sep 30 07:01:28 crc kubenswrapper[4956]: timeout: failed to connect service ":50051" within 1s Sep 30 07:01:28 crc kubenswrapper[4956]: > Sep 30 07:01:28 crc kubenswrapper[4956]: I0930 07:01:28.917201 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2d68b4b8-0213-4951-bf44-a8f7c6a1677c/glance-httpd/0.log" Sep 30 07:01:28 crc kubenswrapper[4956]: I0930 07:01:28.927574 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2d68b4b8-0213-4951-bf44-a8f7c6a1677c/glance-log/0.log" Sep 30 07:01:29 crc kubenswrapper[4956]: I0930 07:01:29.226271 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5f96b888bb-bhtl9_f29ac7f2-13b9-47d8-9218-fb08840e6704/horizon/0.log" Sep 30 07:01:29 crc kubenswrapper[4956]: I0930 07:01:29.312459 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp_278c7daa-7016-4bde-8424-bd0c3491cb3c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:29 crc kubenswrapper[4956]: I0930 07:01:29.437922 4956 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msj55" event={"ID":"7fb8a8c1-947f-405d-b390-b4248b390580","Type":"ContainerStarted","Data":"101bf26b121710da1ea3912ae521ba3adf3f6dfee1f352129e37d67038796cb9"} Sep 30 07:01:29 crc kubenswrapper[4956]: I0930 07:01:29.813920 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5f96b888bb-bhtl9_f29ac7f2-13b9-47d8-9218-fb08840e6704/horizon-log/0.log" Sep 30 07:01:30 crc kubenswrapper[4956]: I0930 07:01:30.023062 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-tnwj7_cc76e3e2-ccde-4622-bcd2-fc347ee16271/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:30 crc kubenswrapper[4956]: I0930 07:01:30.267922 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320201-dhqp5_356a925e-f5c8-48b3-b62c-5c80f7566d01/keystone-cron/0.log" Sep 30 07:01:30 crc kubenswrapper[4956]: I0930 07:01:30.520652 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320261-p55tp_f9787291-2195-4229-910e-88f41de812c8/keystone-cron/0.log" Sep 30 07:01:30 crc kubenswrapper[4956]: I0930 07:01:30.537226 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6798cf9d78-m426q_1997765b-9597-4f93-a11a-8df4f572dee4/keystone-api/0.log" Sep 30 07:01:30 crc kubenswrapper[4956]: I0930 07:01:30.662948 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1b53d536-4a4f-463d-bae5-360555cd4583/kube-state-metrics/0.log" Sep 30 07:01:31 crc kubenswrapper[4956]: I0930 07:01:31.011461 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-scntk_16fa92a8-7fcd-45bd-9b5a-f77149ec71f4/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:31 crc kubenswrapper[4956]: I0930 07:01:31.397271 4956 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_neutron-cbdcdc45c-b9667_0a1e613a-d4fe-4779-97f9-a931d68f083c/neutron-api/0.log" Sep 30 07:01:31 crc kubenswrapper[4956]: I0930 07:01:31.629019 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c_f99ede59-582c-4ed5-99e4-6ba65d66aedb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:31 crc kubenswrapper[4956]: I0930 07:01:31.658202 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cbdcdc45c-b9667_0a1e613a-d4fe-4779-97f9-a931d68f083c/neutron-httpd/0.log" Sep 30 07:01:32 crc kubenswrapper[4956]: I0930 07:01:32.785458 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c3275745-4918-4509-ab41-ad6e6653bcc8/nova-cell0-conductor-conductor/0.log" Sep 30 07:01:33 crc kubenswrapper[4956]: I0930 07:01:33.473136 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0515ec62-950a-4ab8-8462-7030b37609db/nova-cell1-conductor-conductor/0.log" Sep 30 07:01:33 crc kubenswrapper[4956]: I0930 07:01:33.485223 4956 generic.go:334] "Generic (PLEG): container finished" podID="7fb8a8c1-947f-405d-b390-b4248b390580" containerID="101bf26b121710da1ea3912ae521ba3adf3f6dfee1f352129e37d67038796cb9" exitCode=0 Sep 30 07:01:33 crc kubenswrapper[4956]: I0930 07:01:33.485388 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msj55" event={"ID":"7fb8a8c1-947f-405d-b390-b4248b390580","Type":"ContainerDied","Data":"101bf26b121710da1ea3912ae521ba3adf3f6dfee1f352129e37d67038796cb9"} Sep 30 07:01:33 crc kubenswrapper[4956]: I0930 07:01:33.771068 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6a7e115a-efb9-4ea4-aed6-efa6d4b80203/nova-api-log/0.log" Sep 30 07:01:34 crc kubenswrapper[4956]: I0930 07:01:34.166018 4956 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5b82f755-357a-4b4e-84bd-1712077f17a5/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 07:01:34 crc kubenswrapper[4956]: I0930 07:01:34.214916 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6a7e115a-efb9-4ea4-aed6-efa6d4b80203/nova-api-api/0.log" Sep 30 07:01:34 crc kubenswrapper[4956]: I0930 07:01:34.462210 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-n4vfr_8c77505e-cdca-4f43-a276-2102b2c33a58/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:34 crc kubenswrapper[4956]: I0930 07:01:34.558620 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0effab61-b755-4bd2-afb3-71cdf7983dc3/nova-metadata-log/0.log" Sep 30 07:01:35 crc kubenswrapper[4956]: I0930 07:01:35.276885 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_94c3b340-db8b-4ec8-8554-556d914309a2/nova-scheduler-scheduler/0.log" Sep 30 07:01:35 crc kubenswrapper[4956]: I0930 07:01:35.396742 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e0e95031-1e0d-4979-9926-ba52d0208646/mysql-bootstrap/0.log" Sep 30 07:01:35 crc kubenswrapper[4956]: I0930 07:01:35.533447 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msj55" event={"ID":"7fb8a8c1-947f-405d-b390-b4248b390580","Type":"ContainerStarted","Data":"e2640f2ef1e8f1d134635c93a1cbeaba80635bc59bb63d96ca639ddc22679691"} Sep 30 07:01:35 crc kubenswrapper[4956]: I0930 07:01:35.563795 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-msj55" podStartSLOduration=3.437175839 podStartE2EDuration="11.56377668s" podCreationTimestamp="2025-09-30 07:01:24 +0000 UTC" firstStartedPulling="2025-09-30 07:01:26.39529046 +0000 UTC m=+5556.722410985" 
lastFinishedPulling="2025-09-30 07:01:34.521891301 +0000 UTC m=+5564.849011826" observedRunningTime="2025-09-30 07:01:35.563174482 +0000 UTC m=+5565.890295017" watchObservedRunningTime="2025-09-30 07:01:35.56377668 +0000 UTC m=+5565.890897215" Sep 30 07:01:35 crc kubenswrapper[4956]: I0930 07:01:35.884614 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e0e95031-1e0d-4979-9926-ba52d0208646/mysql-bootstrap/0.log" Sep 30 07:01:35 crc kubenswrapper[4956]: I0930 07:01:35.905490 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e0e95031-1e0d-4979-9926-ba52d0208646/galera/0.log" Sep 30 07:01:36 crc kubenswrapper[4956]: I0930 07:01:36.186607 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_541322a8-d098-4331-ab5b-500262d4655c/mysql-bootstrap/0.log" Sep 30 07:01:36 crc kubenswrapper[4956]: I0930 07:01:36.581953 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_541322a8-d098-4331-ab5b-500262d4655c/mysql-bootstrap/0.log" Sep 30 07:01:36 crc kubenswrapper[4956]: I0930 07:01:36.590981 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_541322a8-d098-4331-ab5b-500262d4655c/galera/0.log" Sep 30 07:01:37 crc kubenswrapper[4956]: I0930 07:01:37.114268 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0effab61-b755-4bd2-afb3-71cdf7983dc3/nova-metadata-metadata/0.log" Sep 30 07:01:37 crc kubenswrapper[4956]: I0930 07:01:37.288861 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f596b95b-7b2b-4d7b-8f33-9eb214a39a21/openstackclient/0.log" Sep 30 07:01:37 crc kubenswrapper[4956]: I0930 07:01:37.528465 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:01:37 crc kubenswrapper[4956]: I0930 
07:01:37.601219 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hfrcl_351735b0-6497-4a6c-9562-1ad2785ead5f/openstack-network-exporter/0.log" Sep 30 07:01:37 crc kubenswrapper[4956]: I0930 07:01:37.603884 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:01:37 crc kubenswrapper[4956]: I0930 07:01:37.702944 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p9lvd_f61a8fc3-4802-4dfb-b17e-fd4b8db1b863/ovsdb-server-init/0.log" Sep 30 07:01:37 crc kubenswrapper[4956]: I0930 07:01:37.951875 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p9lvd_f61a8fc3-4802-4dfb-b17e-fd4b8db1b863/ovsdb-server-init/0.log" Sep 30 07:01:38 crc kubenswrapper[4956]: I0930 07:01:38.001745 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p9lvd_f61a8fc3-4802-4dfb-b17e-fd4b8db1b863/ovsdb-server/0.log" Sep 30 07:01:38 crc kubenswrapper[4956]: I0930 07:01:38.236187 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-r7wfs_611676cd-11d2-44c4-bae2-41b6b22f898d/ovn-controller/0.log" Sep 30 07:01:38 crc kubenswrapper[4956]: I0930 07:01:38.378512 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p9lvd_f61a8fc3-4802-4dfb-b17e-fd4b8db1b863/ovs-vswitchd/0.log" Sep 30 07:01:38 crc kubenswrapper[4956]: I0930 07:01:38.498529 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p7vbz"] Sep 30 07:01:38 crc kubenswrapper[4956]: I0930 07:01:38.532284 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fb24p_a6c38cbf-b86e-473b-8f78-b917dc31d239/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:38 crc kubenswrapper[4956]: I0930 07:01:38.580323 4956 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p7vbz" podUID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerName="registry-server" containerID="cri-o://3e589ac469c6548acaecef1b6bd952ff486e4f9ead7172ecc6b212033181878c" gracePeriod=2 Sep 30 07:01:39 crc kubenswrapper[4956]: I0930 07:01:39.193179 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c0fedcc9-8df5-495f-adb8-a42a2a811c49/openstack-network-exporter/0.log" Sep 30 07:01:39 crc kubenswrapper[4956]: I0930 07:01:39.275708 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c0fedcc9-8df5-495f-adb8-a42a2a811c49/ovn-northd/0.log" Sep 30 07:01:39 crc kubenswrapper[4956]: I0930 07:01:39.449291 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_82458356-9089-4c9d-a672-746eb618af3d/openstack-network-exporter/0.log" Sep 30 07:01:39 crc kubenswrapper[4956]: I0930 07:01:39.481179 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_82458356-9089-4c9d-a672-746eb618af3d/ovsdbserver-nb/0.log" Sep 30 07:01:39 crc kubenswrapper[4956]: I0930 07:01:39.593307 4956 generic.go:334] "Generic (PLEG): container finished" podID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerID="3e589ac469c6548acaecef1b6bd952ff486e4f9ead7172ecc6b212033181878c" exitCode=0 Sep 30 07:01:39 crc kubenswrapper[4956]: I0930 07:01:39.593355 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7vbz" event={"ID":"f53e2103-05b5-42b9-9ca7-014f5407c290","Type":"ContainerDied","Data":"3e589ac469c6548acaecef1b6bd952ff486e4f9ead7172ecc6b212033181878c"} Sep 30 07:01:39 crc kubenswrapper[4956]: I0930 07:01:39.993669 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bd6f01ba-ec4b-4b98-8a33-e029d86b258b/openstack-network-exporter/0.log" Sep 30 07:01:40 crc 
kubenswrapper[4956]: I0930 07:01:40.062624 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bd6f01ba-ec4b-4b98-8a33-e029d86b258b/ovsdbserver-sb/0.log" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.203738 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.317165 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f53e2103-05b5-42b9-9ca7-014f5407c290-utilities\") pod \"f53e2103-05b5-42b9-9ca7-014f5407c290\" (UID: \"f53e2103-05b5-42b9-9ca7-014f5407c290\") " Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.317208 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f53e2103-05b5-42b9-9ca7-014f5407c290-catalog-content\") pod \"f53e2103-05b5-42b9-9ca7-014f5407c290\" (UID: \"f53e2103-05b5-42b9-9ca7-014f5407c290\") " Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.317396 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddfh4\" (UniqueName: \"kubernetes.io/projected/f53e2103-05b5-42b9-9ca7-014f5407c290-kube-api-access-ddfh4\") pod \"f53e2103-05b5-42b9-9ca7-014f5407c290\" (UID: \"f53e2103-05b5-42b9-9ca7-014f5407c290\") " Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.318360 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f53e2103-05b5-42b9-9ca7-014f5407c290-utilities" (OuterVolumeSpecName: "utilities") pod "f53e2103-05b5-42b9-9ca7-014f5407c290" (UID: "f53e2103-05b5-42b9-9ca7-014f5407c290"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.342307 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53e2103-05b5-42b9-9ca7-014f5407c290-kube-api-access-ddfh4" (OuterVolumeSpecName: "kube-api-access-ddfh4") pod "f53e2103-05b5-42b9-9ca7-014f5407c290" (UID: "f53e2103-05b5-42b9-9ca7-014f5407c290"). InnerVolumeSpecName "kube-api-access-ddfh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.419474 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f53e2103-05b5-42b9-9ca7-014f5407c290-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.419517 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddfh4\" (UniqueName: \"kubernetes.io/projected/f53e2103-05b5-42b9-9ca7-014f5407c290-kube-api-access-ddfh4\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.436632 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f53e2103-05b5-42b9-9ca7-014f5407c290-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f53e2103-05b5-42b9-9ca7-014f5407c290" (UID: "f53e2103-05b5-42b9-9ca7-014f5407c290"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.512092 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-654d9b45dd-f9lqj_17e5ec46-ef77-490f-b564-b3d4426dd9c8/placement-api/0.log" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.520728 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f53e2103-05b5-42b9-9ca7-014f5407c290-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.582421 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-654d9b45dd-f9lqj_17e5ec46-ef77-490f-b564-b3d4426dd9c8/placement-log/0.log" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.615584 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7vbz" event={"ID":"f53e2103-05b5-42b9-9ca7-014f5407c290","Type":"ContainerDied","Data":"83ba32e766486859abd4237c96ae727aa72be72b8778c4388325ad59a6c9bcd3"} Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.615636 4956 scope.go:117] "RemoveContainer" containerID="3e589ac469c6548acaecef1b6bd952ff486e4f9ead7172ecc6b212033181878c" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.615774 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p7vbz" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.637259 4956 scope.go:117] "RemoveContainer" containerID="654ebede850381428bb5e06b7b61c2d42c13593da9d8d489bc04db4a71f7dfef" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.667762 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p7vbz"] Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.684243 4956 scope.go:117] "RemoveContainer" containerID="4b65d15966394e9fef0b34ee368a18d94ab6e2ca38079759a29f068aaa0924e3" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.695234 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p7vbz"] Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.781408 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_949e372a-c9a8-4db6-a275-512d0236dd01/init-config-reloader/0.log" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.947008 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_949e372a-c9a8-4db6-a275-512d0236dd01/init-config-reloader/0.log" Sep 30 07:01:40 crc kubenswrapper[4956]: I0930 07:01:40.997187 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_949e372a-c9a8-4db6-a275-512d0236dd01/config-reloader/0.log" Sep 30 07:01:41 crc kubenswrapper[4956]: I0930 07:01:41.051145 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_949e372a-c9a8-4db6-a275-512d0236dd01/prometheus/0.log" Sep 30 07:01:41 crc kubenswrapper[4956]: I0930 07:01:41.239052 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_949e372a-c9a8-4db6-a275-512d0236dd01/thanos-sidecar/0.log" Sep 30 07:01:41 crc kubenswrapper[4956]: I0930 07:01:41.381793 4956 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a47b02d9-bc18-4fa1-a0a9-1918de176de9/setup-container/0.log" Sep 30 07:01:41 crc kubenswrapper[4956]: I0930 07:01:41.678258 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a47b02d9-bc18-4fa1-a0a9-1918de176de9/setup-container/0.log" Sep 30 07:01:41 crc kubenswrapper[4956]: I0930 07:01:41.727930 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a47b02d9-bc18-4fa1-a0a9-1918de176de9/rabbitmq/0.log" Sep 30 07:01:41 crc kubenswrapper[4956]: I0930 07:01:41.944385 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_de3a8c94-71b5-4948-9079-cc7009b9a8ea/setup-container/0.log" Sep 30 07:01:42 crc kubenswrapper[4956]: I0930 07:01:42.150638 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_de3a8c94-71b5-4948-9079-cc7009b9a8ea/rabbitmq/0.log" Sep 30 07:01:42 crc kubenswrapper[4956]: I0930 07:01:42.170326 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_de3a8c94-71b5-4948-9079-cc7009b9a8ea/setup-container/0.log" Sep 30 07:01:42 crc kubenswrapper[4956]: I0930 07:01:42.373552 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53e2103-05b5-42b9-9ca7-014f5407c290" path="/var/lib/kubelet/pods/f53e2103-05b5-42b9-9ca7-014f5407c290/volumes" Sep 30 07:01:42 crc kubenswrapper[4956]: I0930 07:01:42.459239 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ac3f1eed-d128-4008-bbe5-0f319495ef52/setup-container/0.log" Sep 30 07:01:42 crc kubenswrapper[4956]: I0930 07:01:42.643046 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ac3f1eed-d128-4008-bbe5-0f319495ef52/setup-container/0.log" Sep 30 07:01:42 crc kubenswrapper[4956]: I0930 07:01:42.698381 4956 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ac3f1eed-d128-4008-bbe5-0f319495ef52/rabbitmq/0.log" Sep 30 07:01:42 crc kubenswrapper[4956]: I0930 07:01:42.967909 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk_17e89a19-be9f-454b-b1c9-8f5b813a9a3b/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:43 crc kubenswrapper[4956]: I0930 07:01:43.081848 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-6c6wt_70901491-8063-4429-baee-c1a295960e2c/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:43 crc kubenswrapper[4956]: I0930 07:01:43.321448 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-fc627_2b60bb30-87ca-43de-a737-1a0fc105197e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:43 crc kubenswrapper[4956]: I0930 07:01:43.795825 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-785q7_fd6826cb-1434-4917-a836-ae952394b1ca/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:43 crc kubenswrapper[4956]: I0930 07:01:43.849710 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-d8mjv_9ad04c2f-1706-4b52-9267-613f86dc0388/ssh-known-hosts-edpm-deployment/0.log" Sep 30 07:01:44 crc kubenswrapper[4956]: I0930 07:01:44.182968 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-779b5888b9-9hp77_62113d6a-1e88-402d-b6bd-4f119a6df416/proxy-server/0.log" Sep 30 07:01:44 crc kubenswrapper[4956]: I0930 07:01:44.412562 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-2zwhw_547b221a-e3b1-4a31-b09c-07022356f1e9/swift-ring-rebalance/0.log" Sep 30 07:01:44 crc kubenswrapper[4956]: I0930 
07:01:44.430044 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-779b5888b9-9hp77_62113d6a-1e88-402d-b6bd-4f119a6df416/proxy-httpd/0.log" Sep 30 07:01:44 crc kubenswrapper[4956]: I0930 07:01:44.699491 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/account-auditor/0.log" Sep 30 07:01:44 crc kubenswrapper[4956]: I0930 07:01:44.757768 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/account-reaper/0.log" Sep 30 07:01:44 crc kubenswrapper[4956]: I0930 07:01:44.851010 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:44 crc kubenswrapper[4956]: I0930 07:01:44.851063 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:44 crc kubenswrapper[4956]: I0930 07:01:44.914734 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.094483 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/container-auditor/0.log" Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.140120 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/account-server/0.log" Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.140691 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/account-replicator/0.log" Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.344685 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/container-server/0.log" Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.345707 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/container-replicator/0.log" Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.440272 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/container-updater/0.log" Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.608870 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/object-auditor/0.log" Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.623348 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/object-expirer/0.log" Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.774882 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.787063 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/object-replicator/0.log" Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.836279 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-msj55"] Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.883348 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/object-server/0.log" Sep 30 07:01:45 crc kubenswrapper[4956]: I0930 07:01:45.938252 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/object-updater/0.log" Sep 30 07:01:46 crc 
kubenswrapper[4956]: I0930 07:01:46.104727 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/rsync/0.log" Sep 30 07:01:46 crc kubenswrapper[4956]: I0930 07:01:46.162453 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/swift-recon-cron/0.log" Sep 30 07:01:46 crc kubenswrapper[4956]: I0930 07:01:46.414496 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c_67054ea3-1f2b-43dd-ada6-8908e9f7c2de/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:46 crc kubenswrapper[4956]: I0930 07:01:46.522253 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_c822eb6a-ddf6-44d5-8a3c-35408a3a0f69/tempest-tests-tempest-tests-runner/0.log" Sep 30 07:01:46 crc kubenswrapper[4956]: I0930 07:01:46.665716 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0b31d8d5-0881-4a71-b953-2e70288e3190/test-operator-logs-container/0.log" Sep 30 07:01:46 crc kubenswrapper[4956]: I0930 07:01:46.934019 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-4j92f_8861310e-59f9-47fc-9224-fc01da1aab28/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:01:47 crc kubenswrapper[4956]: I0930 07:01:47.750360 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-msj55" podUID="7fb8a8c1-947f-405d-b390-b4248b390580" containerName="registry-server" containerID="cri-o://e2640f2ef1e8f1d134635c93a1cbeaba80635bc59bb63d96ca639ddc22679691" gracePeriod=2 Sep 30 07:01:48 crc kubenswrapper[4956]: I0930 07:01:48.060249 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-applier-0_48193dc0-3638-40e3-8b54-2b2049bd5925/watcher-applier/0.log" Sep 30 07:01:48 crc kubenswrapper[4956]: I0930 07:01:48.073837 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:01:48 crc kubenswrapper[4956]: I0930 07:01:48.073902 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:01:48 crc kubenswrapper[4956]: I0930 07:01:48.559866 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_5268a4a4-b72f-47e6-a485-04dcc5935087/watcher-api-log/0.log" Sep 30 07:01:48 crc kubenswrapper[4956]: I0930 07:01:48.792282 4956 generic.go:334] "Generic (PLEG): container finished" podID="7fb8a8c1-947f-405d-b390-b4248b390580" containerID="e2640f2ef1e8f1d134635c93a1cbeaba80635bc59bb63d96ca639ddc22679691" exitCode=0 Sep 30 07:01:48 crc kubenswrapper[4956]: I0930 07:01:48.792341 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msj55" event={"ID":"7fb8a8c1-947f-405d-b390-b4248b390580","Type":"ContainerDied","Data":"e2640f2ef1e8f1d134635c93a1cbeaba80635bc59bb63d96ca639ddc22679691"} Sep 30 07:01:48 crc kubenswrapper[4956]: I0930 07:01:48.859737 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_aad83b1a-8ea2-4f43-a6e3-b8e844a65115/watcher-decision-engine/2.log" Sep 30 07:01:48 crc kubenswrapper[4956]: I0930 07:01:48.993349 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.157950 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb8a8c1-947f-405d-b390-b4248b390580-utilities\") pod \"7fb8a8c1-947f-405d-b390-b4248b390580\" (UID: \"7fb8a8c1-947f-405d-b390-b4248b390580\") " Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.158002 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb8a8c1-947f-405d-b390-b4248b390580-catalog-content\") pod \"7fb8a8c1-947f-405d-b390-b4248b390580\" (UID: \"7fb8a8c1-947f-405d-b390-b4248b390580\") " Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.158226 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44bpk\" (UniqueName: \"kubernetes.io/projected/7fb8a8c1-947f-405d-b390-b4248b390580-kube-api-access-44bpk\") pod \"7fb8a8c1-947f-405d-b390-b4248b390580\" (UID: \"7fb8a8c1-947f-405d-b390-b4248b390580\") " Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.167831 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb8a8c1-947f-405d-b390-b4248b390580-kube-api-access-44bpk" (OuterVolumeSpecName: "kube-api-access-44bpk") pod "7fb8a8c1-947f-405d-b390-b4248b390580" (UID: "7fb8a8c1-947f-405d-b390-b4248b390580"). InnerVolumeSpecName "kube-api-access-44bpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.175964 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fb8a8c1-947f-405d-b390-b4248b390580-utilities" (OuterVolumeSpecName: "utilities") pod "7fb8a8c1-947f-405d-b390-b4248b390580" (UID: "7fb8a8c1-947f-405d-b390-b4248b390580"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.262874 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb8a8c1-947f-405d-b390-b4248b390580-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.263311 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44bpk\" (UniqueName: \"kubernetes.io/projected/7fb8a8c1-947f-405d-b390-b4248b390580-kube-api-access-44bpk\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.330934 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fb8a8c1-947f-405d-b390-b4248b390580-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fb8a8c1-947f-405d-b390-b4248b390580" (UID: "7fb8a8c1-947f-405d-b390-b4248b390580"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.366357 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb8a8c1-947f-405d-b390-b4248b390580-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.810490 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-msj55" event={"ID":"7fb8a8c1-947f-405d-b390-b4248b390580","Type":"ContainerDied","Data":"0196b31d0f3c0c011ad051a4f7be53b16293153514963bf5d556ee8031bfcb2f"} Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.810570 4956 scope.go:117] "RemoveContainer" containerID="e2640f2ef1e8f1d134635c93a1cbeaba80635bc59bb63d96ca639ddc22679691" Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.810814 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-msj55" Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.851273 4956 scope.go:117] "RemoveContainer" containerID="101bf26b121710da1ea3912ae521ba3adf3f6dfee1f352129e37d67038796cb9" Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.860216 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-msj55"] Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.866965 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-msj55"] Sep 30 07:01:49 crc kubenswrapper[4956]: I0930 07:01:49.891101 4956 scope.go:117] "RemoveContainer" containerID="91615f64c2aacab22e2a2e90c91faa07117dfa2f92fe4aa0716cbd1aaa68f889" Sep 30 07:01:50 crc kubenswrapper[4956]: I0930 07:01:50.357222 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fb8a8c1-947f-405d-b390-b4248b390580" path="/var/lib/kubelet/pods/7fb8a8c1-947f-405d-b390-b4248b390580/volumes" Sep 30 07:01:52 crc kubenswrapper[4956]: I0930 07:01:52.756098 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_aad83b1a-8ea2-4f43-a6e3-b8e844a65115/watcher-decision-engine/3.log" Sep 30 07:01:53 crc kubenswrapper[4956]: I0930 07:01:53.532542 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_5268a4a4-b72f-47e6-a485-04dcc5935087/watcher-api/0.log" Sep 30 07:01:59 crc kubenswrapper[4956]: I0930 07:01:59.336770 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_b5126f1e-65cb-4032-9df3-8cd061c43253/memcached/0.log" Sep 30 07:02:18 crc kubenswrapper[4956]: I0930 07:02:18.073409 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Sep 30 07:02:18 crc kubenswrapper[4956]: I0930 07:02:18.073959 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:02:18 crc kubenswrapper[4956]: I0930 07:02:18.074000 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 07:02:18 crc kubenswrapper[4956]: I0930 07:02:18.074741 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:02:18 crc kubenswrapper[4956]: I0930 07:02:18.074786 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" gracePeriod=600 Sep 30 07:02:18 crc kubenswrapper[4956]: E0930 07:02:18.216823 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:02:19 crc kubenswrapper[4956]: 
I0930 07:02:19.098254 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" exitCode=0 Sep 30 07:02:19 crc kubenswrapper[4956]: I0930 07:02:19.098491 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223"} Sep 30 07:02:19 crc kubenswrapper[4956]: I0930 07:02:19.099048 4956 scope.go:117] "RemoveContainer" containerID="402d986c08b22b41e57c748a1f37841a5ee64c5f322e42940d7cbb921db697d7" Sep 30 07:02:19 crc kubenswrapper[4956]: I0930 07:02:19.102193 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:02:19 crc kubenswrapper[4956]: E0930 07:02:19.103498 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:02:30 crc kubenswrapper[4956]: I0930 07:02:30.355170 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:02:30 crc kubenswrapper[4956]: E0930 07:02:30.357401 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:02:43 crc kubenswrapper[4956]: I0930 07:02:43.341353 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:02:43 crc kubenswrapper[4956]: E0930 07:02:43.342539 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:02:43 crc kubenswrapper[4956]: I0930 07:02:43.376024 4956 generic.go:334] "Generic (PLEG): container finished" podID="e4e46d8f-8f7c-4521-8b88-879aa4a93236" containerID="78359b6f6108a72e384d82f51054651ca3b51a3fe73dbcdb16d205622ff3fad3" exitCode=0 Sep 30 07:02:43 crc kubenswrapper[4956]: I0930 07:02:43.376070 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24gnb/crc-debug-h8655" event={"ID":"e4e46d8f-8f7c-4521-8b88-879aa4a93236","Type":"ContainerDied","Data":"78359b6f6108a72e384d82f51054651ca3b51a3fe73dbcdb16d205622ff3fad3"} Sep 30 07:02:44 crc kubenswrapper[4956]: I0930 07:02:44.508833 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24gnb/crc-debug-h8655" Sep 30 07:02:44 crc kubenswrapper[4956]: I0930 07:02:44.564172 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-24gnb/crc-debug-h8655"] Sep 30 07:02:44 crc kubenswrapper[4956]: I0930 07:02:44.575075 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-24gnb/crc-debug-h8655"] Sep 30 07:02:44 crc kubenswrapper[4956]: I0930 07:02:44.607019 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4e46d8f-8f7c-4521-8b88-879aa4a93236-host\") pod \"e4e46d8f-8f7c-4521-8b88-879aa4a93236\" (UID: \"e4e46d8f-8f7c-4521-8b88-879aa4a93236\") " Sep 30 07:02:44 crc kubenswrapper[4956]: I0930 07:02:44.607257 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct7dx\" (UniqueName: \"kubernetes.io/projected/e4e46d8f-8f7c-4521-8b88-879aa4a93236-kube-api-access-ct7dx\") pod \"e4e46d8f-8f7c-4521-8b88-879aa4a93236\" (UID: \"e4e46d8f-8f7c-4521-8b88-879aa4a93236\") " Sep 30 07:02:44 crc kubenswrapper[4956]: I0930 07:02:44.609176 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4e46d8f-8f7c-4521-8b88-879aa4a93236-host" (OuterVolumeSpecName: "host") pod "e4e46d8f-8f7c-4521-8b88-879aa4a93236" (UID: "e4e46d8f-8f7c-4521-8b88-879aa4a93236"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:02:44 crc kubenswrapper[4956]: I0930 07:02:44.632340 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e46d8f-8f7c-4521-8b88-879aa4a93236-kube-api-access-ct7dx" (OuterVolumeSpecName: "kube-api-access-ct7dx") pod "e4e46d8f-8f7c-4521-8b88-879aa4a93236" (UID: "e4e46d8f-8f7c-4521-8b88-879aa4a93236"). InnerVolumeSpecName "kube-api-access-ct7dx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:02:44 crc kubenswrapper[4956]: I0930 07:02:44.712190 4956 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4e46d8f-8f7c-4521-8b88-879aa4a93236-host\") on node \"crc\" DevicePath \"\"" Sep 30 07:02:44 crc kubenswrapper[4956]: I0930 07:02:44.712227 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct7dx\" (UniqueName: \"kubernetes.io/projected/e4e46d8f-8f7c-4521-8b88-879aa4a93236-kube-api-access-ct7dx\") on node \"crc\" DevicePath \"\"" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.396512 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9068f22140c259aeddec318e0ecbbff33e7aee7ee43602c050cef3670b7ddd1" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.396717 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24gnb/crc-debug-h8655" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.850389 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-24gnb/crc-debug-jsjww"] Sep 30 07:02:45 crc kubenswrapper[4956]: E0930 07:02:45.851089 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerName="extract-content" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.851105 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerName="extract-content" Sep 30 07:02:45 crc kubenswrapper[4956]: E0930 07:02:45.851141 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb8a8c1-947f-405d-b390-b4248b390580" containerName="registry-server" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.851149 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb8a8c1-947f-405d-b390-b4248b390580" containerName="registry-server" Sep 30 07:02:45 crc kubenswrapper[4956]: 
E0930 07:02:45.851165 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerName="registry-server" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.851174 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerName="registry-server" Sep 30 07:02:45 crc kubenswrapper[4956]: E0930 07:02:45.851187 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerName="extract-utilities" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.851194 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerName="extract-utilities" Sep 30 07:02:45 crc kubenswrapper[4956]: E0930 07:02:45.851202 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb8a8c1-947f-405d-b390-b4248b390580" containerName="extract-utilities" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.851209 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb8a8c1-947f-405d-b390-b4248b390580" containerName="extract-utilities" Sep 30 07:02:45 crc kubenswrapper[4956]: E0930 07:02:45.851227 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb8a8c1-947f-405d-b390-b4248b390580" containerName="extract-content" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.851234 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb8a8c1-947f-405d-b390-b4248b390580" containerName="extract-content" Sep 30 07:02:45 crc kubenswrapper[4956]: E0930 07:02:45.851266 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e46d8f-8f7c-4521-8b88-879aa4a93236" containerName="container-00" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.851275 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e46d8f-8f7c-4521-8b88-879aa4a93236" containerName="container-00" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 
07:02:45.851501 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb8a8c1-947f-405d-b390-b4248b390580" containerName="registry-server" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.851536 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53e2103-05b5-42b9-9ca7-014f5407c290" containerName="registry-server" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.851546 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e46d8f-8f7c-4521-8b88-879aa4a93236" containerName="container-00" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.852268 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24gnb/crc-debug-jsjww" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.855174 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-24gnb"/"default-dockercfg-4zng4" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.935776 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h9l4\" (UniqueName: \"kubernetes.io/projected/7956e635-d03d-4293-bdd1-1809ada6fb1d-kube-api-access-9h9l4\") pod \"crc-debug-jsjww\" (UID: \"7956e635-d03d-4293-bdd1-1809ada6fb1d\") " pod="openshift-must-gather-24gnb/crc-debug-jsjww" Sep 30 07:02:45 crc kubenswrapper[4956]: I0930 07:02:45.935996 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7956e635-d03d-4293-bdd1-1809ada6fb1d-host\") pod \"crc-debug-jsjww\" (UID: \"7956e635-d03d-4293-bdd1-1809ada6fb1d\") " pod="openshift-must-gather-24gnb/crc-debug-jsjww" Sep 30 07:02:46 crc kubenswrapper[4956]: I0930 07:02:46.038647 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h9l4\" (UniqueName: \"kubernetes.io/projected/7956e635-d03d-4293-bdd1-1809ada6fb1d-kube-api-access-9h9l4\") 
pod \"crc-debug-jsjww\" (UID: \"7956e635-d03d-4293-bdd1-1809ada6fb1d\") " pod="openshift-must-gather-24gnb/crc-debug-jsjww" Sep 30 07:02:46 crc kubenswrapper[4956]: I0930 07:02:46.038853 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7956e635-d03d-4293-bdd1-1809ada6fb1d-host\") pod \"crc-debug-jsjww\" (UID: \"7956e635-d03d-4293-bdd1-1809ada6fb1d\") " pod="openshift-must-gather-24gnb/crc-debug-jsjww" Sep 30 07:02:46 crc kubenswrapper[4956]: I0930 07:02:46.038949 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7956e635-d03d-4293-bdd1-1809ada6fb1d-host\") pod \"crc-debug-jsjww\" (UID: \"7956e635-d03d-4293-bdd1-1809ada6fb1d\") " pod="openshift-must-gather-24gnb/crc-debug-jsjww" Sep 30 07:02:46 crc kubenswrapper[4956]: I0930 07:02:46.072402 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h9l4\" (UniqueName: \"kubernetes.io/projected/7956e635-d03d-4293-bdd1-1809ada6fb1d-kube-api-access-9h9l4\") pod \"crc-debug-jsjww\" (UID: \"7956e635-d03d-4293-bdd1-1809ada6fb1d\") " pod="openshift-must-gather-24gnb/crc-debug-jsjww" Sep 30 07:02:46 crc kubenswrapper[4956]: I0930 07:02:46.173093 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24gnb/crc-debug-jsjww" Sep 30 07:02:46 crc kubenswrapper[4956]: I0930 07:02:46.357190 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e46d8f-8f7c-4521-8b88-879aa4a93236" path="/var/lib/kubelet/pods/e4e46d8f-8f7c-4521-8b88-879aa4a93236/volumes" Sep 30 07:02:46 crc kubenswrapper[4956]: I0930 07:02:46.407436 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24gnb/crc-debug-jsjww" event={"ID":"7956e635-d03d-4293-bdd1-1809ada6fb1d","Type":"ContainerStarted","Data":"86d1feaac31676be7824357f9308e65abb24b4539cde420610607687ce3d63a7"} Sep 30 07:02:47 crc kubenswrapper[4956]: I0930 07:02:47.428776 4956 generic.go:334] "Generic (PLEG): container finished" podID="7956e635-d03d-4293-bdd1-1809ada6fb1d" containerID="c43ae9940271b9d78188c0b04f0a2c933a0cccdc944130e7224cfc5a624aaad2" exitCode=0 Sep 30 07:02:47 crc kubenswrapper[4956]: I0930 07:02:47.428852 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24gnb/crc-debug-jsjww" event={"ID":"7956e635-d03d-4293-bdd1-1809ada6fb1d","Type":"ContainerDied","Data":"c43ae9940271b9d78188c0b04f0a2c933a0cccdc944130e7224cfc5a624aaad2"} Sep 30 07:02:48 crc kubenswrapper[4956]: I0930 07:02:48.552612 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24gnb/crc-debug-jsjww" Sep 30 07:02:48 crc kubenswrapper[4956]: I0930 07:02:48.698912 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h9l4\" (UniqueName: \"kubernetes.io/projected/7956e635-d03d-4293-bdd1-1809ada6fb1d-kube-api-access-9h9l4\") pod \"7956e635-d03d-4293-bdd1-1809ada6fb1d\" (UID: \"7956e635-d03d-4293-bdd1-1809ada6fb1d\") " Sep 30 07:02:48 crc kubenswrapper[4956]: I0930 07:02:48.699092 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7956e635-d03d-4293-bdd1-1809ada6fb1d-host\") pod \"7956e635-d03d-4293-bdd1-1809ada6fb1d\" (UID: \"7956e635-d03d-4293-bdd1-1809ada6fb1d\") " Sep 30 07:02:48 crc kubenswrapper[4956]: I0930 07:02:48.699807 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7956e635-d03d-4293-bdd1-1809ada6fb1d-host" (OuterVolumeSpecName: "host") pod "7956e635-d03d-4293-bdd1-1809ada6fb1d" (UID: "7956e635-d03d-4293-bdd1-1809ada6fb1d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:02:48 crc kubenswrapper[4956]: I0930 07:02:48.718864 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7956e635-d03d-4293-bdd1-1809ada6fb1d-kube-api-access-9h9l4" (OuterVolumeSpecName: "kube-api-access-9h9l4") pod "7956e635-d03d-4293-bdd1-1809ada6fb1d" (UID: "7956e635-d03d-4293-bdd1-1809ada6fb1d"). InnerVolumeSpecName "kube-api-access-9h9l4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:02:48 crc kubenswrapper[4956]: I0930 07:02:48.801155 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h9l4\" (UniqueName: \"kubernetes.io/projected/7956e635-d03d-4293-bdd1-1809ada6fb1d-kube-api-access-9h9l4\") on node \"crc\" DevicePath \"\"" Sep 30 07:02:48 crc kubenswrapper[4956]: I0930 07:02:48.801184 4956 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7956e635-d03d-4293-bdd1-1809ada6fb1d-host\") on node \"crc\" DevicePath \"\"" Sep 30 07:02:49 crc kubenswrapper[4956]: I0930 07:02:49.447939 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24gnb/crc-debug-jsjww" event={"ID":"7956e635-d03d-4293-bdd1-1809ada6fb1d","Type":"ContainerDied","Data":"86d1feaac31676be7824357f9308e65abb24b4539cde420610607687ce3d63a7"} Sep 30 07:02:49 crc kubenswrapper[4956]: I0930 07:02:49.447981 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86d1feaac31676be7824357f9308e65abb24b4539cde420610607687ce3d63a7" Sep 30 07:02:49 crc kubenswrapper[4956]: I0930 07:02:49.448034 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24gnb/crc-debug-jsjww" Sep 30 07:02:56 crc kubenswrapper[4956]: I0930 07:02:56.778450 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-24gnb/crc-debug-jsjww"] Sep 30 07:02:56 crc kubenswrapper[4956]: I0930 07:02:56.790356 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-24gnb/crc-debug-jsjww"] Sep 30 07:02:57 crc kubenswrapper[4956]: I0930 07:02:57.342177 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:02:57 crc kubenswrapper[4956]: E0930 07:02:57.342849 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:02:57 crc kubenswrapper[4956]: I0930 07:02:57.977642 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-24gnb/crc-debug-l76r5"] Sep 30 07:02:57 crc kubenswrapper[4956]: E0930 07:02:57.978245 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7956e635-d03d-4293-bdd1-1809ada6fb1d" containerName="container-00" Sep 30 07:02:57 crc kubenswrapper[4956]: I0930 07:02:57.978262 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="7956e635-d03d-4293-bdd1-1809ada6fb1d" containerName="container-00" Sep 30 07:02:57 crc kubenswrapper[4956]: I0930 07:02:57.978562 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="7956e635-d03d-4293-bdd1-1809ada6fb1d" containerName="container-00" Sep 30 07:02:57 crc kubenswrapper[4956]: I0930 07:02:57.979505 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24gnb/crc-debug-l76r5" Sep 30 07:02:57 crc kubenswrapper[4956]: I0930 07:02:57.982750 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-24gnb"/"default-dockercfg-4zng4" Sep 30 07:02:58 crc kubenswrapper[4956]: I0930 07:02:58.080359 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn4q4\" (UniqueName: \"kubernetes.io/projected/21703590-752c-4860-ac69-32674ee4a9cf-kube-api-access-zn4q4\") pod \"crc-debug-l76r5\" (UID: \"21703590-752c-4860-ac69-32674ee4a9cf\") " pod="openshift-must-gather-24gnb/crc-debug-l76r5" Sep 30 07:02:58 crc kubenswrapper[4956]: I0930 07:02:58.080917 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21703590-752c-4860-ac69-32674ee4a9cf-host\") pod \"crc-debug-l76r5\" (UID: \"21703590-752c-4860-ac69-32674ee4a9cf\") " pod="openshift-must-gather-24gnb/crc-debug-l76r5" Sep 30 07:02:58 crc kubenswrapper[4956]: I0930 07:02:58.182779 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21703590-752c-4860-ac69-32674ee4a9cf-host\") pod \"crc-debug-l76r5\" (UID: \"21703590-752c-4860-ac69-32674ee4a9cf\") " pod="openshift-must-gather-24gnb/crc-debug-l76r5" Sep 30 07:02:58 crc kubenswrapper[4956]: I0930 07:02:58.182964 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21703590-752c-4860-ac69-32674ee4a9cf-host\") pod \"crc-debug-l76r5\" (UID: \"21703590-752c-4860-ac69-32674ee4a9cf\") " pod="openshift-must-gather-24gnb/crc-debug-l76r5" Sep 30 07:02:58 crc kubenswrapper[4956]: I0930 07:02:58.183227 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn4q4\" (UniqueName: 
\"kubernetes.io/projected/21703590-752c-4860-ac69-32674ee4a9cf-kube-api-access-zn4q4\") pod \"crc-debug-l76r5\" (UID: \"21703590-752c-4860-ac69-32674ee4a9cf\") " pod="openshift-must-gather-24gnb/crc-debug-l76r5" Sep 30 07:02:58 crc kubenswrapper[4956]: I0930 07:02:58.199886 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn4q4\" (UniqueName: \"kubernetes.io/projected/21703590-752c-4860-ac69-32674ee4a9cf-kube-api-access-zn4q4\") pod \"crc-debug-l76r5\" (UID: \"21703590-752c-4860-ac69-32674ee4a9cf\") " pod="openshift-must-gather-24gnb/crc-debug-l76r5" Sep 30 07:02:58 crc kubenswrapper[4956]: I0930 07:02:58.304298 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24gnb/crc-debug-l76r5" Sep 30 07:02:58 crc kubenswrapper[4956]: I0930 07:02:58.357511 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7956e635-d03d-4293-bdd1-1809ada6fb1d" path="/var/lib/kubelet/pods/7956e635-d03d-4293-bdd1-1809ada6fb1d/volumes" Sep 30 07:02:58 crc kubenswrapper[4956]: I0930 07:02:58.526707 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24gnb/crc-debug-l76r5" event={"ID":"21703590-752c-4860-ac69-32674ee4a9cf","Type":"ContainerStarted","Data":"cdf73d15af78198beb10bd61cd00953eeff28add2461218c6cddc616d2ac1cdd"} Sep 30 07:02:59 crc kubenswrapper[4956]: I0930 07:02:59.537915 4956 generic.go:334] "Generic (PLEG): container finished" podID="21703590-752c-4860-ac69-32674ee4a9cf" containerID="2e65853d5bdf6fa59ad964e54bb155d24caf1fc5cd51dfe132ae96806763b5a0" exitCode=0 Sep 30 07:02:59 crc kubenswrapper[4956]: I0930 07:02:59.538068 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24gnb/crc-debug-l76r5" event={"ID":"21703590-752c-4860-ac69-32674ee4a9cf","Type":"ContainerDied","Data":"2e65853d5bdf6fa59ad964e54bb155d24caf1fc5cd51dfe132ae96806763b5a0"} Sep 30 07:02:59 crc kubenswrapper[4956]: I0930 07:02:59.596455 4956 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-24gnb/crc-debug-l76r5"] Sep 30 07:02:59 crc kubenswrapper[4956]: I0930 07:02:59.607581 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-24gnb/crc-debug-l76r5"] Sep 30 07:03:00 crc kubenswrapper[4956]: I0930 07:03:00.655596 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24gnb/crc-debug-l76r5" Sep 30 07:03:00 crc kubenswrapper[4956]: I0930 07:03:00.757991 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21703590-752c-4860-ac69-32674ee4a9cf-host\") pod \"21703590-752c-4860-ac69-32674ee4a9cf\" (UID: \"21703590-752c-4860-ac69-32674ee4a9cf\") " Sep 30 07:03:00 crc kubenswrapper[4956]: I0930 07:03:00.758170 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn4q4\" (UniqueName: \"kubernetes.io/projected/21703590-752c-4860-ac69-32674ee4a9cf-kube-api-access-zn4q4\") pod \"21703590-752c-4860-ac69-32674ee4a9cf\" (UID: \"21703590-752c-4860-ac69-32674ee4a9cf\") " Sep 30 07:03:00 crc kubenswrapper[4956]: I0930 07:03:00.758385 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21703590-752c-4860-ac69-32674ee4a9cf-host" (OuterVolumeSpecName: "host") pod "21703590-752c-4860-ac69-32674ee4a9cf" (UID: "21703590-752c-4860-ac69-32674ee4a9cf"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:03:00 crc kubenswrapper[4956]: I0930 07:03:00.758790 4956 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21703590-752c-4860-ac69-32674ee4a9cf-host\") on node \"crc\" DevicePath \"\"" Sep 30 07:03:00 crc kubenswrapper[4956]: I0930 07:03:00.774312 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21703590-752c-4860-ac69-32674ee4a9cf-kube-api-access-zn4q4" (OuterVolumeSpecName: "kube-api-access-zn4q4") pod "21703590-752c-4860-ac69-32674ee4a9cf" (UID: "21703590-752c-4860-ac69-32674ee4a9cf"). InnerVolumeSpecName "kube-api-access-zn4q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:03:00 crc kubenswrapper[4956]: I0930 07:03:00.860541 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn4q4\" (UniqueName: \"kubernetes.io/projected/21703590-752c-4860-ac69-32674ee4a9cf-kube-api-access-zn4q4\") on node \"crc\" DevicePath \"\"" Sep 30 07:03:01 crc kubenswrapper[4956]: I0930 07:03:01.338799 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/util/0.log" Sep 30 07:03:01 crc kubenswrapper[4956]: I0930 07:03:01.524763 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/pull/0.log" Sep 30 07:03:01 crc kubenswrapper[4956]: I0930 07:03:01.554733 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/util/0.log" Sep 30 07:03:01 crc kubenswrapper[4956]: I0930 07:03:01.558435 4956 scope.go:117] "RemoveContainer" 
containerID="2e65853d5bdf6fa59ad964e54bb155d24caf1fc5cd51dfe132ae96806763b5a0" Sep 30 07:03:01 crc kubenswrapper[4956]: I0930 07:03:01.558466 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24gnb/crc-debug-l76r5" Sep 30 07:03:01 crc kubenswrapper[4956]: I0930 07:03:01.583876 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/pull/0.log" Sep 30 07:03:01 crc kubenswrapper[4956]: I0930 07:03:01.749653 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/pull/0.log" Sep 30 07:03:01 crc kubenswrapper[4956]: I0930 07:03:01.751257 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/util/0.log" Sep 30 07:03:01 crc kubenswrapper[4956]: I0930 07:03:01.764350 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/extract/0.log" Sep 30 07:03:01 crc kubenswrapper[4956]: I0930 07:03:01.982734 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-mrz5k_e8df7824-e9a8-4794-bb91-411ae6639639/kube-rbac-proxy/0.log" Sep 30 07:03:01 crc kubenswrapper[4956]: I0930 07:03:01.988562 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-mrz5k_e8df7824-e9a8-4794-bb91-411ae6639639/manager/0.log" Sep 30 07:03:02 crc kubenswrapper[4956]: I0930 07:03:02.045803 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-hhv4p_3c1366b7-aa64-4089-a853-e2027658e237/kube-rbac-proxy/0.log" Sep 30 07:03:02 crc kubenswrapper[4956]: I0930 07:03:02.228986 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-hhv4p_3c1366b7-aa64-4089-a853-e2027658e237/manager/0.log" Sep 30 07:03:02 crc kubenswrapper[4956]: I0930 07:03:02.232299 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-nghm2_ffab19aa-8b8f-4067-b19c-3ccd9352cb12/kube-rbac-proxy/0.log" Sep 30 07:03:02 crc kubenswrapper[4956]: I0930 07:03:02.251798 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-nghm2_ffab19aa-8b8f-4067-b19c-3ccd9352cb12/manager/0.log" Sep 30 07:03:02 crc kubenswrapper[4956]: I0930 07:03:02.353651 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21703590-752c-4860-ac69-32674ee4a9cf" path="/var/lib/kubelet/pods/21703590-752c-4860-ac69-32674ee4a9cf/volumes" Sep 30 07:03:02 crc kubenswrapper[4956]: I0930 07:03:02.414369 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-kvfmc_8ce74c21-dde5-40bb-8c42-96e4165b8541/kube-rbac-proxy/0.log" Sep 30 07:03:02 crc kubenswrapper[4956]: I0930 07:03:02.587996 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-kvfmc_8ce74c21-dde5-40bb-8c42-96e4165b8541/manager/0.log" Sep 30 07:03:02 crc kubenswrapper[4956]: I0930 07:03:02.602856 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-mklbg_c1571b7d-f7d4-470d-90ac-d276a39ea2b1/kube-rbac-proxy/0.log" Sep 30 07:03:02 crc kubenswrapper[4956]: I0930 07:03:02.643785 4956 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-mklbg_c1571b7d-f7d4-470d-90ac-d276a39ea2b1/manager/0.log" Sep 30 07:03:02 crc kubenswrapper[4956]: I0930 07:03:02.866581 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-ncr6h_8c84e3e7-f42f-46df-af16-516bc2cac4a0/manager/0.log" Sep 30 07:03:02 crc kubenswrapper[4956]: I0930 07:03:02.881015 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-ncr6h_8c84e3e7-f42f-46df-af16-516bc2cac4a0/kube-rbac-proxy/0.log" Sep 30 07:03:03 crc kubenswrapper[4956]: I0930 07:03:03.177659 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d9c7d9477-g85xg_89ba53cc-155b-485b-926c-83eaa0772764/kube-rbac-proxy/0.log" Sep 30 07:03:03 crc kubenswrapper[4956]: I0930 07:03:03.270089 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f589bc7f7-jvf6n_6ee14caa-a939-467a-bdbb-4160d336eaee/kube-rbac-proxy/0.log" Sep 30 07:03:03 crc kubenswrapper[4956]: I0930 07:03:03.412747 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f589bc7f7-jvf6n_6ee14caa-a939-467a-bdbb-4160d336eaee/manager/0.log" Sep 30 07:03:03 crc kubenswrapper[4956]: I0930 07:03:03.432183 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d9c7d9477-g85xg_89ba53cc-155b-485b-926c-83eaa0772764/manager/0.log" Sep 30 07:03:03 crc kubenswrapper[4956]: I0930 07:03:03.518128 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-6hssf_101087b5-cd1e-40f3-916f-5e8f5354ac2d/kube-rbac-proxy/0.log" Sep 30 07:03:03 crc 
kubenswrapper[4956]: I0930 07:03:03.701069 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-6hssf_101087b5-cd1e-40f3-916f-5e8f5354ac2d/manager/0.log" Sep 30 07:03:03 crc kubenswrapper[4956]: I0930 07:03:03.734975 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-m8xs8_124abac1-4adc-4a56-8d2b-241e0eb4bf57/kube-rbac-proxy/0.log" Sep 30 07:03:03 crc kubenswrapper[4956]: I0930 07:03:03.750675 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-m8xs8_124abac1-4adc-4a56-8d2b-241e0eb4bf57/manager/0.log" Sep 30 07:03:03 crc kubenswrapper[4956]: I0930 07:03:03.893717 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-nhj4d_1ed82f79-3f95-4293-937a-f5d82ce37f10/kube-rbac-proxy/0.log" Sep 30 07:03:03 crc kubenswrapper[4956]: I0930 07:03:03.929792 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-nhj4d_1ed82f79-3f95-4293-937a-f5d82ce37f10/manager/0.log" Sep 30 07:03:04 crc kubenswrapper[4956]: I0930 07:03:04.093700 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6b96467f46-5x2c8_ca55e873-96fb-4348-ba98-58ab9648de78/kube-rbac-proxy/0.log" Sep 30 07:03:04 crc kubenswrapper[4956]: I0930 07:03:04.150532 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6b96467f46-5x2c8_ca55e873-96fb-4348-ba98-58ab9648de78/manager/0.log" Sep 30 07:03:04 crc kubenswrapper[4956]: I0930 07:03:04.209688 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79f9fc9fd8-j79gf_cbc3fab0-8876-49a7-a85f-4844e253595f/kube-rbac-proxy/0.log" Sep 30 07:03:04 crc kubenswrapper[4956]: I0930 07:03:04.380375 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6fb7d6b8bf-2rcv8_c07dffb1-ebd0-44e9-8061-ce680870aba3/kube-rbac-proxy/0.log" Sep 30 07:03:04 crc kubenswrapper[4956]: I0930 07:03:04.397101 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79f9fc9fd8-j79gf_cbc3fab0-8876-49a7-a85f-4844e253595f/manager/0.log" Sep 30 07:03:04 crc kubenswrapper[4956]: I0930 07:03:04.440799 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6fb7d6b8bf-2rcv8_c07dffb1-ebd0-44e9-8061-ce680870aba3/manager/0.log" Sep 30 07:03:05 crc kubenswrapper[4956]: I0930 07:03:05.142521 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82_0b6d8a4b-faca-4779-be46-219d3c0a3e22/kube-rbac-proxy/0.log" Sep 30 07:03:05 crc kubenswrapper[4956]: I0930 07:03:05.168463 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7b7bb8bd67-7nhwl_2e369192-5374-4d18-954d-7d46ff60e9c1/kube-rbac-proxy/0.log" Sep 30 07:03:05 crc kubenswrapper[4956]: I0930 07:03:05.188387 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82_0b6d8a4b-faca-4779-be46-219d3c0a3e22/manager/0.log" Sep 30 07:03:05 crc kubenswrapper[4956]: I0930 07:03:05.386514 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-56dc567787-c5lxf_3a68c8b3-216d-4ba4-b841-054d52526caf/kube-rbac-proxy/0.log" Sep 30 07:03:05 crc 
kubenswrapper[4956]: I0930 07:03:05.579732 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-56dc567787-c5lxf_3a68c8b3-216d-4ba4-b841-054d52526caf/operator/0.log" Sep 30 07:03:05 crc kubenswrapper[4956]: I0930 07:03:05.637962 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-v888v_c0c96af5-0c02-4dfd-91e5-947696cb4899/kube-rbac-proxy/0.log" Sep 30 07:03:05 crc kubenswrapper[4956]: I0930 07:03:05.711978 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-4bg6n_dbd8b6e7-61f4-4a07-92ab-ed42a432df93/registry-server/0.log" Sep 30 07:03:05 crc kubenswrapper[4956]: I0930 07:03:05.911337 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-v888v_c0c96af5-0c02-4dfd-91e5-947696cb4899/manager/0.log" Sep 30 07:03:05 crc kubenswrapper[4956]: I0930 07:03:05.919497 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-xh7wx_69909a1b-9121-45ae-aaeb-e63950300ec9/kube-rbac-proxy/0.log" Sep 30 07:03:05 crc kubenswrapper[4956]: I0930 07:03:05.987421 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-xh7wx_69909a1b-9121-45ae-aaeb-e63950300ec9/manager/0.log" Sep 30 07:03:06 crc kubenswrapper[4956]: I0930 07:03:06.178006 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-657c6b68c7-6hgwr_f93913e6-5d74-4030-ac26-a10781a72db0/kube-rbac-proxy/0.log" Sep 30 07:03:06 crc kubenswrapper[4956]: I0930 07:03:06.194333 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-j2hpj_49d67534-20e0-48be-9614-eec49889c4a7/operator/0.log" Sep 30 07:03:06 crc kubenswrapper[4956]: I0930 07:03:06.342431 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-657c6b68c7-6hgwr_f93913e6-5d74-4030-ac26-a10781a72db0/manager/0.log" Sep 30 07:03:06 crc kubenswrapper[4956]: I0930 07:03:06.441510 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-wxgr6_9cdfca4b-0805-4ebc-92e1-906044d82e4b/kube-rbac-proxy/0.log" Sep 30 07:03:06 crc kubenswrapper[4956]: I0930 07:03:06.810664 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-wxgr6_9cdfca4b-0805-4ebc-92e1-906044d82e4b/manager/0.log" Sep 30 07:03:06 crc kubenswrapper[4956]: I0930 07:03:06.915083 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7b7bb8bd67-7nhwl_2e369192-5374-4d18-954d-7d46ff60e9c1/manager/0.log" Sep 30 07:03:07 crc kubenswrapper[4956]: I0930 07:03:07.050347 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb97fcf96-jz8bg_2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f/manager/0.log" Sep 30 07:03:07 crc kubenswrapper[4956]: I0930 07:03:07.055690 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb97fcf96-jz8bg_2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f/kube-rbac-proxy/0.log" Sep 30 07:03:07 crc kubenswrapper[4956]: I0930 07:03:07.111091 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75756dd4d9-sbn5x_f620cf06-9ba1-4866-9964-dc38e574c889/kube-rbac-proxy/0.log" Sep 30 07:03:07 crc kubenswrapper[4956]: I0930 07:03:07.171945 4956 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75756dd4d9-sbn5x_f620cf06-9ba1-4866-9964-dc38e574c889/manager/0.log" Sep 30 07:03:11 crc kubenswrapper[4956]: I0930 07:03:11.341785 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:03:11 crc kubenswrapper[4956]: E0930 07:03:11.342690 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:03:22 crc kubenswrapper[4956]: I0930 07:03:22.969541 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pclsc_aa301454-b9a6-4bed-acd1-f2cf109b5259/control-plane-machine-set-operator/0.log" Sep 30 07:03:23 crc kubenswrapper[4956]: I0930 07:03:23.138934 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wjwhf_9ec8ec3c-f0d3-41b1-a311-2eca015cd63a/kube-rbac-proxy/0.log" Sep 30 07:03:23 crc kubenswrapper[4956]: I0930 07:03:23.150989 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wjwhf_9ec8ec3c-f0d3-41b1-a311-2eca015cd63a/machine-api-operator/0.log" Sep 30 07:03:24 crc kubenswrapper[4956]: I0930 07:03:24.341462 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:03:24 crc kubenswrapper[4956]: E0930 07:03:24.341741 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:03:35 crc kubenswrapper[4956]: I0930 07:03:35.447615 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-skkpn_528b17e7-a42f-4e5e-8731-4f3d84d59cf7/cert-manager-controller/0.log" Sep 30 07:03:35 crc kubenswrapper[4956]: I0930 07:03:35.678860 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-9jcp8_5384e744-0e0a-4743-bf15-cb75c35951ac/cert-manager-cainjector/0.log" Sep 30 07:03:35 crc kubenswrapper[4956]: I0930 07:03:35.738801 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-cmwsp_50748964-4222-40d7-a12c-6ab004bf8a77/cert-manager-webhook/0.log" Sep 30 07:03:39 crc kubenswrapper[4956]: I0930 07:03:39.341346 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:03:39 crc kubenswrapper[4956]: E0930 07:03:39.342127 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:03:48 crc kubenswrapper[4956]: I0930 07:03:48.067451 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-svf7v_7a6e9183-4fbf-4549-926d-d8a48c0d17ac/nmstate-console-plugin/0.log" Sep 30 07:03:48 crc 
kubenswrapper[4956]: I0930 07:03:48.286761 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-dxkwj_1f61456f-3dcb-4831-9760-c06143ec9b14/nmstate-handler/0.log" Sep 30 07:03:48 crc kubenswrapper[4956]: I0930 07:03:48.320616 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-cfj2d_5c73bb61-b711-4efb-8ba7-de118a9b30e7/kube-rbac-proxy/0.log" Sep 30 07:03:48 crc kubenswrapper[4956]: I0930 07:03:48.344577 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-cfj2d_5c73bb61-b711-4efb-8ba7-de118a9b30e7/nmstate-metrics/0.log" Sep 30 07:03:48 crc kubenswrapper[4956]: I0930 07:03:48.482616 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-txhvp_34203d19-cae9-4ef7-863e-03f524e1a662/nmstate-operator/0.log" Sep 30 07:03:48 crc kubenswrapper[4956]: I0930 07:03:48.532663 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-vsdnt_8afb5dc0-a1aa-4c21-95b1-62c64b452ff1/nmstate-webhook/0.log" Sep 30 07:03:52 crc kubenswrapper[4956]: I0930 07:03:52.346042 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:03:52 crc kubenswrapper[4956]: E0930 07:03:52.346973 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:04:03 crc kubenswrapper[4956]: I0930 07:04:03.170507 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-5d688f5ffc-ntsz7_0f07b182-685f-40c3-961e-eebfbf2d5fe5/kube-rbac-proxy/0.log" Sep 30 07:04:03 crc kubenswrapper[4956]: I0930 07:04:03.341175 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:04:03 crc kubenswrapper[4956]: E0930 07:04:03.341439 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:04:03 crc kubenswrapper[4956]: I0930 07:04:03.379083 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-frr-files/0.log" Sep 30 07:04:03 crc kubenswrapper[4956]: I0930 07:04:03.413269 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-ntsz7_0f07b182-685f-40c3-961e-eebfbf2d5fe5/controller/0.log" Sep 30 07:04:03 crc kubenswrapper[4956]: I0930 07:04:03.518910 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-reloader/0.log" Sep 30 07:04:03 crc kubenswrapper[4956]: I0930 07:04:03.577474 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-frr-files/0.log" Sep 30 07:04:03 crc kubenswrapper[4956]: I0930 07:04:03.591230 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-reloader/0.log" Sep 30 07:04:03 crc kubenswrapper[4956]: I0930 07:04:03.597502 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-metrics/0.log" Sep 30 07:04:03 crc kubenswrapper[4956]: I0930 07:04:03.797328 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-frr-files/0.log" Sep 30 07:04:03 crc kubenswrapper[4956]: I0930 07:04:03.824585 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-metrics/0.log" Sep 30 07:04:03 crc kubenswrapper[4956]: I0930 07:04:03.847844 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-metrics/0.log" Sep 30 07:04:03 crc kubenswrapper[4956]: I0930 07:04:03.862713 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-reloader/0.log" Sep 30 07:04:04 crc kubenswrapper[4956]: I0930 07:04:04.062582 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/controller/0.log" Sep 30 07:04:04 crc kubenswrapper[4956]: I0930 07:04:04.070726 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-frr-files/0.log" Sep 30 07:04:04 crc kubenswrapper[4956]: I0930 07:04:04.070882 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-metrics/0.log" Sep 30 07:04:04 crc kubenswrapper[4956]: I0930 07:04:04.078778 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-reloader/0.log" Sep 30 07:04:04 crc kubenswrapper[4956]: I0930 07:04:04.241733 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/kube-rbac-proxy/0.log" Sep 30 07:04:04 crc kubenswrapper[4956]: I0930 07:04:04.299715 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/frr-metrics/0.log" Sep 30 07:04:04 crc kubenswrapper[4956]: I0930 07:04:04.318700 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/kube-rbac-proxy-frr/0.log" Sep 30 07:04:04 crc kubenswrapper[4956]: I0930 07:04:04.458472 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/reloader/0.log" Sep 30 07:04:04 crc kubenswrapper[4956]: I0930 07:04:04.607688 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-7r8bt_26860b1e-eab3-4a86-b87d-3c52529f70dd/frr-k8s-webhook-server/0.log" Sep 30 07:04:04 crc kubenswrapper[4956]: I0930 07:04:04.789112 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7478c46d8f-h6xxf_1cc7609e-41b3-4fb2-98f4-6cc743299a2f/manager/0.log" Sep 30 07:04:05 crc kubenswrapper[4956]: I0930 07:04:05.069155 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6bbf45b88f-6bpm6_889c063f-2550-48ed-957c-150f8f1192e3/webhook-server/0.log" Sep 30 07:04:05 crc kubenswrapper[4956]: I0930 07:04:05.282575 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5pvg9_c59b762c-1f05-46c5-8d6d-2bf39a8592f0/kube-rbac-proxy/0.log" Sep 30 07:04:05 crc kubenswrapper[4956]: I0930 07:04:05.526106 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/frr/0.log" Sep 30 07:04:05 crc kubenswrapper[4956]: I0930 07:04:05.782040 4956 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5pvg9_c59b762c-1f05-46c5-8d6d-2bf39a8592f0/speaker/0.log" Sep 30 07:04:16 crc kubenswrapper[4956]: I0930 07:04:16.341606 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:04:16 crc kubenswrapper[4956]: E0930 07:04:16.342338 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:04:17 crc kubenswrapper[4956]: I0930 07:04:17.956900 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/util/0.log" Sep 30 07:04:18 crc kubenswrapper[4956]: I0930 07:04:18.139977 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/pull/0.log" Sep 30 07:04:18 crc kubenswrapper[4956]: I0930 07:04:18.155387 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/util/0.log" Sep 30 07:04:18 crc kubenswrapper[4956]: I0930 07:04:18.371091 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/pull/0.log" Sep 30 07:04:18 crc kubenswrapper[4956]: I0930 07:04:18.510403 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/util/0.log" Sep 30 07:04:18 crc kubenswrapper[4956]: I0930 07:04:18.522336 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/extract/0.log" Sep 30 07:04:18 crc kubenswrapper[4956]: I0930 07:04:18.530585 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/pull/0.log" Sep 30 07:04:18 crc kubenswrapper[4956]: I0930 07:04:18.679062 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/util/0.log" Sep 30 07:04:18 crc kubenswrapper[4956]: I0930 07:04:18.834376 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/util/0.log" Sep 30 07:04:18 crc kubenswrapper[4956]: I0930 07:04:18.846717 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/pull/0.log" Sep 30 07:04:18 crc kubenswrapper[4956]: I0930 07:04:18.888250 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/pull/0.log" Sep 30 07:04:19 crc kubenswrapper[4956]: I0930 07:04:19.039945 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/pull/0.log" Sep 30 
07:04:19 crc kubenswrapper[4956]: I0930 07:04:19.062947 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/extract/0.log" Sep 30 07:04:19 crc kubenswrapper[4956]: I0930 07:04:19.067316 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/util/0.log" Sep 30 07:04:19 crc kubenswrapper[4956]: I0930 07:04:19.243979 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/extract-utilities/0.log" Sep 30 07:04:19 crc kubenswrapper[4956]: I0930 07:04:19.436592 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/extract-content/0.log" Sep 30 07:04:19 crc kubenswrapper[4956]: I0930 07:04:19.466340 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/extract-utilities/0.log" Sep 30 07:04:19 crc kubenswrapper[4956]: I0930 07:04:19.473073 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/extract-content/0.log" Sep 30 07:04:19 crc kubenswrapper[4956]: I0930 07:04:19.660146 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/extract-content/0.log" Sep 30 07:04:19 crc kubenswrapper[4956]: I0930 07:04:19.667571 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/extract-utilities/0.log" Sep 30 07:04:19 crc kubenswrapper[4956]: I0930 07:04:19.935019 
4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/extract-utilities/0.log" Sep 30 07:04:20 crc kubenswrapper[4956]: I0930 07:04:20.135589 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/extract-utilities/0.log" Sep 30 07:04:20 crc kubenswrapper[4956]: I0930 07:04:20.151907 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/extract-content/0.log" Sep 30 07:04:20 crc kubenswrapper[4956]: I0930 07:04:20.207000 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/extract-content/0.log" Sep 30 07:04:20 crc kubenswrapper[4956]: I0930 07:04:20.358349 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/extract-content/0.log" Sep 30 07:04:20 crc kubenswrapper[4956]: I0930 07:04:20.364612 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/registry-server/0.log" Sep 30 07:04:20 crc kubenswrapper[4956]: I0930 07:04:20.418154 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/extract-utilities/0.log" Sep 30 07:04:20 crc kubenswrapper[4956]: I0930 07:04:20.551959 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/util/0.log" Sep 30 07:04:20 crc kubenswrapper[4956]: I0930 07:04:20.838389 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/util/0.log" Sep 30 07:04:20 crc kubenswrapper[4956]: I0930 07:04:20.861066 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/pull/0.log" Sep 30 07:04:20 crc kubenswrapper[4956]: I0930 07:04:20.906656 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/pull/0.log" Sep 30 07:04:21 crc kubenswrapper[4956]: I0930 07:04:21.233476 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/registry-server/0.log" Sep 30 07:04:21 crc kubenswrapper[4956]: I0930 07:04:21.256522 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/util/0.log" Sep 30 07:04:21 crc kubenswrapper[4956]: I0930 07:04:21.276885 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/pull/0.log" Sep 30 07:04:21 crc kubenswrapper[4956]: I0930 07:04:21.277985 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/extract/0.log" Sep 30 07:04:21 crc kubenswrapper[4956]: I0930 07:04:21.468414 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2zl8q_747a9b33-025d-4b52-9b54-7d1b829c6cef/marketplace-operator/0.log" Sep 30 07:04:21 crc kubenswrapper[4956]: 
I0930 07:04:21.502625 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/extract-utilities/0.log" Sep 30 07:04:21 crc kubenswrapper[4956]: I0930 07:04:21.727402 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/extract-content/0.log" Sep 30 07:04:21 crc kubenswrapper[4956]: I0930 07:04:21.729162 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/extract-utilities/0.log" Sep 30 07:04:21 crc kubenswrapper[4956]: I0930 07:04:21.758891 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/extract-content/0.log" Sep 30 07:04:22 crc kubenswrapper[4956]: I0930 07:04:22.135079 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/extract-utilities/0.log" Sep 30 07:04:22 crc kubenswrapper[4956]: I0930 07:04:22.143374 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/extract-utilities/0.log" Sep 30 07:04:22 crc kubenswrapper[4956]: I0930 07:04:22.161656 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/extract-content/0.log" Sep 30 07:04:22 crc kubenswrapper[4956]: I0930 07:04:22.382028 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/registry-server/0.log" Sep 30 07:04:22 crc kubenswrapper[4956]: I0930 07:04:22.414827 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/extract-utilities/0.log" Sep 30 07:04:22 crc kubenswrapper[4956]: I0930 07:04:22.438461 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/extract-content/0.log" Sep 30 07:04:22 crc kubenswrapper[4956]: I0930 07:04:22.485732 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/extract-content/0.log" Sep 30 07:04:22 crc kubenswrapper[4956]: I0930 07:04:22.630560 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/extract-content/0.log" Sep 30 07:04:22 crc kubenswrapper[4956]: I0930 07:04:22.695064 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/extract-utilities/0.log" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.600664 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/registry-server/0.log" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.673554 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-489nq"] Sep 30 07:04:23 crc kubenswrapper[4956]: E0930 07:04:23.674008 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21703590-752c-4860-ac69-32674ee4a9cf" containerName="container-00" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.674025 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="21703590-752c-4860-ac69-32674ee4a9cf" containerName="container-00" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.674252 4956 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="21703590-752c-4860-ac69-32674ee4a9cf" containerName="container-00" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.676044 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.686422 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-489nq"] Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.771094 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c5a711-65de-4a6a-b144-1c6710406fde-catalog-content\") pod \"certified-operators-489nq\" (UID: \"63c5a711-65de-4a6a-b144-1c6710406fde\") " pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.771172 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c5a711-65de-4a6a-b144-1c6710406fde-utilities\") pod \"certified-operators-489nq\" (UID: \"63c5a711-65de-4a6a-b144-1c6710406fde\") " pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.771219 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcm5w\" (UniqueName: \"kubernetes.io/projected/63c5a711-65de-4a6a-b144-1c6710406fde-kube-api-access-tcm5w\") pod \"certified-operators-489nq\" (UID: \"63c5a711-65de-4a6a-b144-1c6710406fde\") " pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.873375 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c5a711-65de-4a6a-b144-1c6710406fde-catalog-content\") pod \"certified-operators-489nq\" (UID: 
\"63c5a711-65de-4a6a-b144-1c6710406fde\") " pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.873741 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c5a711-65de-4a6a-b144-1c6710406fde-utilities\") pod \"certified-operators-489nq\" (UID: \"63c5a711-65de-4a6a-b144-1c6710406fde\") " pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.874169 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcm5w\" (UniqueName: \"kubernetes.io/projected/63c5a711-65de-4a6a-b144-1c6710406fde-kube-api-access-tcm5w\") pod \"certified-operators-489nq\" (UID: \"63c5a711-65de-4a6a-b144-1c6710406fde\") " pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.874128 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c5a711-65de-4a6a-b144-1c6710406fde-utilities\") pod \"certified-operators-489nq\" (UID: \"63c5a711-65de-4a6a-b144-1c6710406fde\") " pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.873850 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c5a711-65de-4a6a-b144-1c6710406fde-catalog-content\") pod \"certified-operators-489nq\" (UID: \"63c5a711-65de-4a6a-b144-1c6710406fde\") " pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.907067 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcm5w\" (UniqueName: \"kubernetes.io/projected/63c5a711-65de-4a6a-b144-1c6710406fde-kube-api-access-tcm5w\") pod \"certified-operators-489nq\" (UID: 
\"63c5a711-65de-4a6a-b144-1c6710406fde\") " pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:23 crc kubenswrapper[4956]: I0930 07:04:23.995945 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:24 crc kubenswrapper[4956]: I0930 07:04:24.591409 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-489nq"] Sep 30 07:04:25 crc kubenswrapper[4956]: I0930 07:04:25.422928 4956 generic.go:334] "Generic (PLEG): container finished" podID="63c5a711-65de-4a6a-b144-1c6710406fde" containerID="8733a8c69151278fcc750d0a1c973e1a79fc1d1d02a03984108aec6e07eb5167" exitCode=0 Sep 30 07:04:25 crc kubenswrapper[4956]: I0930 07:04:25.423036 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489nq" event={"ID":"63c5a711-65de-4a6a-b144-1c6710406fde","Type":"ContainerDied","Data":"8733a8c69151278fcc750d0a1c973e1a79fc1d1d02a03984108aec6e07eb5167"} Sep 30 07:04:25 crc kubenswrapper[4956]: I0930 07:04:25.423261 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489nq" event={"ID":"63c5a711-65de-4a6a-b144-1c6710406fde","Type":"ContainerStarted","Data":"92e38ab0f0410d861476c2081206dbc03b857f9244b27280fecc8c02f75522bc"} Sep 30 07:04:25 crc kubenswrapper[4956]: I0930 07:04:25.425184 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:04:26 crc kubenswrapper[4956]: I0930 07:04:26.433146 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489nq" event={"ID":"63c5a711-65de-4a6a-b144-1c6710406fde","Type":"ContainerStarted","Data":"6f5e19e3f5b8faaff2fa1a7236b5d5ba0918b16d1921024287938c4319e17465"} Sep 30 07:04:27 crc kubenswrapper[4956]: I0930 07:04:27.442598 4956 generic.go:334] "Generic (PLEG): container finished" 
podID="63c5a711-65de-4a6a-b144-1c6710406fde" containerID="6f5e19e3f5b8faaff2fa1a7236b5d5ba0918b16d1921024287938c4319e17465" exitCode=0 Sep 30 07:04:27 crc kubenswrapper[4956]: I0930 07:04:27.442712 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489nq" event={"ID":"63c5a711-65de-4a6a-b144-1c6710406fde","Type":"ContainerDied","Data":"6f5e19e3f5b8faaff2fa1a7236b5d5ba0918b16d1921024287938c4319e17465"} Sep 30 07:04:28 crc kubenswrapper[4956]: I0930 07:04:28.453662 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489nq" event={"ID":"63c5a711-65de-4a6a-b144-1c6710406fde","Type":"ContainerStarted","Data":"0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7"} Sep 30 07:04:28 crc kubenswrapper[4956]: I0930 07:04:28.472244 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-489nq" podStartSLOduration=2.9028624130000003 podStartE2EDuration="5.472227183s" podCreationTimestamp="2025-09-30 07:04:23 +0000 UTC" firstStartedPulling="2025-09-30 07:04:25.42491854 +0000 UTC m=+5735.752039065" lastFinishedPulling="2025-09-30 07:04:27.99428332 +0000 UTC m=+5738.321403835" observedRunningTime="2025-09-30 07:04:28.467890067 +0000 UTC m=+5738.795010622" watchObservedRunningTime="2025-09-30 07:04:28.472227183 +0000 UTC m=+5738.799347708" Sep 30 07:04:29 crc kubenswrapper[4956]: I0930 07:04:29.342596 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:04:29 crc kubenswrapper[4956]: E0930 07:04:29.342839 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:04:33 crc kubenswrapper[4956]: I0930 07:04:33.996999 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:33 crc kubenswrapper[4956]: I0930 07:04:33.997546 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:34 crc kubenswrapper[4956]: I0930 07:04:34.048880 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:34 crc kubenswrapper[4956]: I0930 07:04:34.553713 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:34 crc kubenswrapper[4956]: I0930 07:04:34.607093 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-489nq"] Sep 30 07:04:35 crc kubenswrapper[4956]: I0930 07:04:35.361958 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-vl7fv_45c3115a-12d7-4cd7-83a8-f9a720e63ce6/prometheus-operator/0.log" Sep 30 07:04:35 crc kubenswrapper[4956]: I0930 07:04:35.534667 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5_0dd88442-b29e-47e9-b221-57ac09bbc7cb/prometheus-operator-admission-webhook/0.log" Sep 30 07:04:35 crc kubenswrapper[4956]: I0930 07:04:35.558333 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b_868bdc73-4a3b-49ec-9676-0d98a950e1ed/prometheus-operator-admission-webhook/0.log" Sep 30 07:04:35 crc kubenswrapper[4956]: I0930 07:04:35.737978 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-pv5cs_f9203661-7a5b-45cd-9057-78b70739a89b/operator/0.log" Sep 30 07:04:35 crc kubenswrapper[4956]: I0930 07:04:35.773462 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-j6v4z_d32f519a-014e-43e2-b715-78e9fd9197c3/perses-operator/0.log" Sep 30 07:04:36 crc kubenswrapper[4956]: I0930 07:04:36.525865 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-489nq" podUID="63c5a711-65de-4a6a-b144-1c6710406fde" containerName="registry-server" containerID="cri-o://0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7" gracePeriod=2 Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.172876 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.349542 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c5a711-65de-4a6a-b144-1c6710406fde-utilities\") pod \"63c5a711-65de-4a6a-b144-1c6710406fde\" (UID: \"63c5a711-65de-4a6a-b144-1c6710406fde\") " Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.350061 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcm5w\" (UniqueName: \"kubernetes.io/projected/63c5a711-65de-4a6a-b144-1c6710406fde-kube-api-access-tcm5w\") pod \"63c5a711-65de-4a6a-b144-1c6710406fde\" (UID: \"63c5a711-65de-4a6a-b144-1c6710406fde\") " Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.350174 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c5a711-65de-4a6a-b144-1c6710406fde-catalog-content\") pod \"63c5a711-65de-4a6a-b144-1c6710406fde\" (UID: 
\"63c5a711-65de-4a6a-b144-1c6710406fde\") " Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.350479 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63c5a711-65de-4a6a-b144-1c6710406fde-utilities" (OuterVolumeSpecName: "utilities") pod "63c5a711-65de-4a6a-b144-1c6710406fde" (UID: "63c5a711-65de-4a6a-b144-1c6710406fde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.351230 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c5a711-65de-4a6a-b144-1c6710406fde-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.358487 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c5a711-65de-4a6a-b144-1c6710406fde-kube-api-access-tcm5w" (OuterVolumeSpecName: "kube-api-access-tcm5w") pod "63c5a711-65de-4a6a-b144-1c6710406fde" (UID: "63c5a711-65de-4a6a-b144-1c6710406fde"). InnerVolumeSpecName "kube-api-access-tcm5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.393722 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63c5a711-65de-4a6a-b144-1c6710406fde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63c5a711-65de-4a6a-b144-1c6710406fde" (UID: "63c5a711-65de-4a6a-b144-1c6710406fde"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.453608 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcm5w\" (UniqueName: \"kubernetes.io/projected/63c5a711-65de-4a6a-b144-1c6710406fde-kube-api-access-tcm5w\") on node \"crc\" DevicePath \"\"" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.453933 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c5a711-65de-4a6a-b144-1c6710406fde-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.536823 4956 generic.go:334] "Generic (PLEG): container finished" podID="63c5a711-65de-4a6a-b144-1c6710406fde" containerID="0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7" exitCode=0 Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.537682 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489nq" event={"ID":"63c5a711-65de-4a6a-b144-1c6710406fde","Type":"ContainerDied","Data":"0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7"} Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.537768 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489nq" event={"ID":"63c5a711-65de-4a6a-b144-1c6710406fde","Type":"ContainerDied","Data":"92e38ab0f0410d861476c2081206dbc03b857f9244b27280fecc8c02f75522bc"} Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.537850 4956 scope.go:117] "RemoveContainer" containerID="0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.538052 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-489nq" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.576992 4956 scope.go:117] "RemoveContainer" containerID="6f5e19e3f5b8faaff2fa1a7236b5d5ba0918b16d1921024287938c4319e17465" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.577803 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-489nq"] Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.587526 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-489nq"] Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.595067 4956 scope.go:117] "RemoveContainer" containerID="8733a8c69151278fcc750d0a1c973e1a79fc1d1d02a03984108aec6e07eb5167" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.636075 4956 scope.go:117] "RemoveContainer" containerID="0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7" Sep 30 07:04:37 crc kubenswrapper[4956]: E0930 07:04:37.636621 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7\": container with ID starting with 0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7 not found: ID does not exist" containerID="0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.636658 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7"} err="failed to get container status \"0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7\": rpc error: code = NotFound desc = could not find container \"0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7\": container with ID starting with 0eb2b9d10fe02fd9a18e6d096f324757871f5a30a6df5e77187ffb0c0109c1d7 not 
found: ID does not exist" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.636686 4956 scope.go:117] "RemoveContainer" containerID="6f5e19e3f5b8faaff2fa1a7236b5d5ba0918b16d1921024287938c4319e17465" Sep 30 07:04:37 crc kubenswrapper[4956]: E0930 07:04:37.636987 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f5e19e3f5b8faaff2fa1a7236b5d5ba0918b16d1921024287938c4319e17465\": container with ID starting with 6f5e19e3f5b8faaff2fa1a7236b5d5ba0918b16d1921024287938c4319e17465 not found: ID does not exist" containerID="6f5e19e3f5b8faaff2fa1a7236b5d5ba0918b16d1921024287938c4319e17465" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.637041 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5e19e3f5b8faaff2fa1a7236b5d5ba0918b16d1921024287938c4319e17465"} err="failed to get container status \"6f5e19e3f5b8faaff2fa1a7236b5d5ba0918b16d1921024287938c4319e17465\": rpc error: code = NotFound desc = could not find container \"6f5e19e3f5b8faaff2fa1a7236b5d5ba0918b16d1921024287938c4319e17465\": container with ID starting with 6f5e19e3f5b8faaff2fa1a7236b5d5ba0918b16d1921024287938c4319e17465 not found: ID does not exist" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.637067 4956 scope.go:117] "RemoveContainer" containerID="8733a8c69151278fcc750d0a1c973e1a79fc1d1d02a03984108aec6e07eb5167" Sep 30 07:04:37 crc kubenswrapper[4956]: E0930 07:04:37.637462 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8733a8c69151278fcc750d0a1c973e1a79fc1d1d02a03984108aec6e07eb5167\": container with ID starting with 8733a8c69151278fcc750d0a1c973e1a79fc1d1d02a03984108aec6e07eb5167 not found: ID does not exist" containerID="8733a8c69151278fcc750d0a1c973e1a79fc1d1d02a03984108aec6e07eb5167" Sep 30 07:04:37 crc kubenswrapper[4956]: I0930 07:04:37.637486 4956 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8733a8c69151278fcc750d0a1c973e1a79fc1d1d02a03984108aec6e07eb5167"} err="failed to get container status \"8733a8c69151278fcc750d0a1c973e1a79fc1d1d02a03984108aec6e07eb5167\": rpc error: code = NotFound desc = could not find container \"8733a8c69151278fcc750d0a1c973e1a79fc1d1d02a03984108aec6e07eb5167\": container with ID starting with 8733a8c69151278fcc750d0a1c973e1a79fc1d1d02a03984108aec6e07eb5167 not found: ID does not exist" Sep 30 07:04:38 crc kubenswrapper[4956]: I0930 07:04:38.355937 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63c5a711-65de-4a6a-b144-1c6710406fde" path="/var/lib/kubelet/pods/63c5a711-65de-4a6a-b144-1c6710406fde/volumes" Sep 30 07:04:43 crc kubenswrapper[4956]: I0930 07:04:43.341712 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:04:43 crc kubenswrapper[4956]: E0930 07:04:43.343419 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:04:55 crc kubenswrapper[4956]: I0930 07:04:55.342197 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:04:55 crc kubenswrapper[4956]: E0930 07:04:55.343294 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:05:08 crc kubenswrapper[4956]: I0930 07:05:08.352663 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:05:08 crc kubenswrapper[4956]: E0930 07:05:08.356306 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:05:23 crc kubenswrapper[4956]: I0930 07:05:23.341337 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:05:23 crc kubenswrapper[4956]: E0930 07:05:23.343337 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:05:35 crc kubenswrapper[4956]: I0930 07:05:35.343298 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:05:35 crc kubenswrapper[4956]: E0930 07:05:35.344989 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:05:47 crc kubenswrapper[4956]: I0930 07:05:47.341535 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:05:47 crc kubenswrapper[4956]: E0930 07:05:47.342220 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:06:01 crc kubenswrapper[4956]: I0930 07:06:01.342036 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:06:01 crc kubenswrapper[4956]: E0930 07:06:01.342882 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:06:16 crc kubenswrapper[4956]: I0930 07:06:16.341277 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:06:16 crc kubenswrapper[4956]: E0930 07:06:16.341940 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:06:18 crc kubenswrapper[4956]: I0930 07:06:18.474706 4956 scope.go:117] "RemoveContainer" containerID="78359b6f6108a72e384d82f51054651ca3b51a3fe73dbcdb16d205622ff3fad3" Sep 30 07:06:29 crc kubenswrapper[4956]: I0930 07:06:29.341047 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:06:29 crc kubenswrapper[4956]: E0930 07:06:29.341924 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:06:40 crc kubenswrapper[4956]: I0930 07:06:40.358016 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:06:40 crc kubenswrapper[4956]: E0930 07:06:40.360522 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:06:51 crc kubenswrapper[4956]: E0930 07:06:51.088091 4956 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod597bee6b_c39b_402a_bdce_6d8ac7243b62.slice/crio-89f2b471941e4cb6939a5ff07ac14e8c107cd5a9d38ad4e56afb06e65a956f6a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod597bee6b_c39b_402a_bdce_6d8ac7243b62.slice/crio-conmon-89f2b471941e4cb6939a5ff07ac14e8c107cd5a9d38ad4e56afb06e65a956f6a.scope\": RecentStats: unable to find data in memory cache]" Sep 30 07:06:51 crc kubenswrapper[4956]: I0930 07:06:51.129559 4956 generic.go:334] "Generic (PLEG): container finished" podID="597bee6b-c39b-402a-bdce-6d8ac7243b62" containerID="89f2b471941e4cb6939a5ff07ac14e8c107cd5a9d38ad4e56afb06e65a956f6a" exitCode=0 Sep 30 07:06:51 crc kubenswrapper[4956]: I0930 07:06:51.129610 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-24gnb/must-gather-lvqn6" event={"ID":"597bee6b-c39b-402a-bdce-6d8ac7243b62","Type":"ContainerDied","Data":"89f2b471941e4cb6939a5ff07ac14e8c107cd5a9d38ad4e56afb06e65a956f6a"} Sep 30 07:06:51 crc kubenswrapper[4956]: I0930 07:06:51.130093 4956 scope.go:117] "RemoveContainer" containerID="89f2b471941e4cb6939a5ff07ac14e8c107cd5a9d38ad4e56afb06e65a956f6a" Sep 30 07:06:51 crc kubenswrapper[4956]: I0930 07:06:51.341287 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:06:51 crc kubenswrapper[4956]: E0930 07:06:51.341685 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:06:51 crc kubenswrapper[4956]: I0930 07:06:51.937994 4956 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-must-gather-24gnb_must-gather-lvqn6_597bee6b-c39b-402a-bdce-6d8ac7243b62/gather/0.log" Sep 30 07:07:00 crc kubenswrapper[4956]: I0930 07:07:00.733219 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-24gnb/must-gather-lvqn6"] Sep 30 07:07:00 crc kubenswrapper[4956]: I0930 07:07:00.734148 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-24gnb/must-gather-lvqn6" podUID="597bee6b-c39b-402a-bdce-6d8ac7243b62" containerName="copy" containerID="cri-o://9c5d5a2b0292cc93f8d7d891e62a97ac199ace2d19a3d4ecccb50601318f4159" gracePeriod=2 Sep 30 07:07:00 crc kubenswrapper[4956]: I0930 07:07:00.744412 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-24gnb/must-gather-lvqn6"] Sep 30 07:07:01 crc kubenswrapper[4956]: I0930 07:07:01.226487 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-24gnb_must-gather-lvqn6_597bee6b-c39b-402a-bdce-6d8ac7243b62/copy/0.log" Sep 30 07:07:01 crc kubenswrapper[4956]: I0930 07:07:01.227536 4956 generic.go:334] "Generic (PLEG): container finished" podID="597bee6b-c39b-402a-bdce-6d8ac7243b62" containerID="9c5d5a2b0292cc93f8d7d891e62a97ac199ace2d19a3d4ecccb50601318f4159" exitCode=143 Sep 30 07:07:01 crc kubenswrapper[4956]: I0930 07:07:01.227688 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="573c0ce1bada751bfde6fec7bc381717c260a72a873ce8389966c2c64da59854" Sep 30 07:07:01 crc kubenswrapper[4956]: I0930 07:07:01.277231 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-24gnb_must-gather-lvqn6_597bee6b-c39b-402a-bdce-6d8ac7243b62/copy/0.log" Sep 30 07:07:01 crc kubenswrapper[4956]: I0930 07:07:01.277849 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-24gnb/must-gather-lvqn6" Sep 30 07:07:01 crc kubenswrapper[4956]: I0930 07:07:01.367075 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjzfn\" (UniqueName: \"kubernetes.io/projected/597bee6b-c39b-402a-bdce-6d8ac7243b62-kube-api-access-qjzfn\") pod \"597bee6b-c39b-402a-bdce-6d8ac7243b62\" (UID: \"597bee6b-c39b-402a-bdce-6d8ac7243b62\") " Sep 30 07:07:01 crc kubenswrapper[4956]: I0930 07:07:01.367302 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/597bee6b-c39b-402a-bdce-6d8ac7243b62-must-gather-output\") pod \"597bee6b-c39b-402a-bdce-6d8ac7243b62\" (UID: \"597bee6b-c39b-402a-bdce-6d8ac7243b62\") " Sep 30 07:07:01 crc kubenswrapper[4956]: I0930 07:07:01.373199 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/597bee6b-c39b-402a-bdce-6d8ac7243b62-kube-api-access-qjzfn" (OuterVolumeSpecName: "kube-api-access-qjzfn") pod "597bee6b-c39b-402a-bdce-6d8ac7243b62" (UID: "597bee6b-c39b-402a-bdce-6d8ac7243b62"). InnerVolumeSpecName "kube-api-access-qjzfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:07:01 crc kubenswrapper[4956]: I0930 07:07:01.470678 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjzfn\" (UniqueName: \"kubernetes.io/projected/597bee6b-c39b-402a-bdce-6d8ac7243b62-kube-api-access-qjzfn\") on node \"crc\" DevicePath \"\"" Sep 30 07:07:01 crc kubenswrapper[4956]: I0930 07:07:01.572659 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/597bee6b-c39b-402a-bdce-6d8ac7243b62-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "597bee6b-c39b-402a-bdce-6d8ac7243b62" (UID: "597bee6b-c39b-402a-bdce-6d8ac7243b62"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:07:01 crc kubenswrapper[4956]: I0930 07:07:01.573167 4956 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/597bee6b-c39b-402a-bdce-6d8ac7243b62-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 07:07:02 crc kubenswrapper[4956]: I0930 07:07:02.234538 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-24gnb/must-gather-lvqn6" Sep 30 07:07:02 crc kubenswrapper[4956]: I0930 07:07:02.350529 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="597bee6b-c39b-402a-bdce-6d8ac7243b62" path="/var/lib/kubelet/pods/597bee6b-c39b-402a-bdce-6d8ac7243b62/volumes" Sep 30 07:07:05 crc kubenswrapper[4956]: I0930 07:07:05.341672 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:07:05 crc kubenswrapper[4956]: E0930 07:07:05.342396 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:07:18 crc kubenswrapper[4956]: I0930 07:07:18.531179 4956 scope.go:117] "RemoveContainer" containerID="89f2b471941e4cb6939a5ff07ac14e8c107cd5a9d38ad4e56afb06e65a956f6a" Sep 30 07:07:18 crc kubenswrapper[4956]: I0930 07:07:18.569633 4956 scope.go:117] "RemoveContainer" containerID="9c5d5a2b0292cc93f8d7d891e62a97ac199ace2d19a3d4ecccb50601318f4159" Sep 30 07:07:19 crc kubenswrapper[4956]: I0930 07:07:19.341223 4956 scope.go:117] "RemoveContainer" containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 
07:07:20 crc kubenswrapper[4956]: I0930 07:07:20.423501 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"1a72ba49612cd65b05f8abd486b9207dfc5c1a9a11fa2c0d5239532cc63509a0"} Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.532334 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mv6ft/must-gather-5ls46"] Sep 30 07:07:27 crc kubenswrapper[4956]: E0930 07:07:27.533654 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="597bee6b-c39b-402a-bdce-6d8ac7243b62" containerName="gather" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.533678 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="597bee6b-c39b-402a-bdce-6d8ac7243b62" containerName="gather" Sep 30 07:07:27 crc kubenswrapper[4956]: E0930 07:07:27.533710 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="597bee6b-c39b-402a-bdce-6d8ac7243b62" containerName="copy" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.533725 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="597bee6b-c39b-402a-bdce-6d8ac7243b62" containerName="copy" Sep 30 07:07:27 crc kubenswrapper[4956]: E0930 07:07:27.533742 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c5a711-65de-4a6a-b144-1c6710406fde" containerName="extract-content" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.533754 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c5a711-65de-4a6a-b144-1c6710406fde" containerName="extract-content" Sep 30 07:07:27 crc kubenswrapper[4956]: E0930 07:07:27.533805 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c5a711-65de-4a6a-b144-1c6710406fde" containerName="registry-server" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.533817 4956 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="63c5a711-65de-4a6a-b144-1c6710406fde" containerName="registry-server" Sep 30 07:07:27 crc kubenswrapper[4956]: E0930 07:07:27.533856 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c5a711-65de-4a6a-b144-1c6710406fde" containerName="extract-utilities" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.533868 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c5a711-65de-4a6a-b144-1c6710406fde" containerName="extract-utilities" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.534286 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="597bee6b-c39b-402a-bdce-6d8ac7243b62" containerName="copy" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.534319 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="63c5a711-65de-4a6a-b144-1c6710406fde" containerName="registry-server" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.534347 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="597bee6b-c39b-402a-bdce-6d8ac7243b62" containerName="gather" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.536423 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mv6ft/must-gather-5ls46" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.538572 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mv6ft"/"default-dockercfg-5bxll" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.539329 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mv6ft"/"openshift-service-ca.crt" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.539946 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mv6ft"/"kube-root-ca.crt" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.564600 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mv6ft/must-gather-5ls46"] Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.727520 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w84b\" (UniqueName: \"kubernetes.io/projected/c55bd359-a76e-48ab-8fa1-0c402351b7b0-kube-api-access-2w84b\") pod \"must-gather-5ls46\" (UID: \"c55bd359-a76e-48ab-8fa1-0c402351b7b0\") " pod="openshift-must-gather-mv6ft/must-gather-5ls46" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.727578 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c55bd359-a76e-48ab-8fa1-0c402351b7b0-must-gather-output\") pod \"must-gather-5ls46\" (UID: \"c55bd359-a76e-48ab-8fa1-0c402351b7b0\") " pod="openshift-must-gather-mv6ft/must-gather-5ls46" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.830678 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w84b\" (UniqueName: \"kubernetes.io/projected/c55bd359-a76e-48ab-8fa1-0c402351b7b0-kube-api-access-2w84b\") pod \"must-gather-5ls46\" (UID: \"c55bd359-a76e-48ab-8fa1-0c402351b7b0\") " 
pod="openshift-must-gather-mv6ft/must-gather-5ls46" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.830724 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c55bd359-a76e-48ab-8fa1-0c402351b7b0-must-gather-output\") pod \"must-gather-5ls46\" (UID: \"c55bd359-a76e-48ab-8fa1-0c402351b7b0\") " pod="openshift-must-gather-mv6ft/must-gather-5ls46" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.831448 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c55bd359-a76e-48ab-8fa1-0c402351b7b0-must-gather-output\") pod \"must-gather-5ls46\" (UID: \"c55bd359-a76e-48ab-8fa1-0c402351b7b0\") " pod="openshift-must-gather-mv6ft/must-gather-5ls46" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.849492 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w84b\" (UniqueName: \"kubernetes.io/projected/c55bd359-a76e-48ab-8fa1-0c402351b7b0-kube-api-access-2w84b\") pod \"must-gather-5ls46\" (UID: \"c55bd359-a76e-48ab-8fa1-0c402351b7b0\") " pod="openshift-must-gather-mv6ft/must-gather-5ls46" Sep 30 07:07:27 crc kubenswrapper[4956]: I0930 07:07:27.861642 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mv6ft/must-gather-5ls46" Sep 30 07:07:28 crc kubenswrapper[4956]: I0930 07:07:28.302834 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mv6ft/must-gather-5ls46"] Sep 30 07:07:28 crc kubenswrapper[4956]: I0930 07:07:28.510511 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/must-gather-5ls46" event={"ID":"c55bd359-a76e-48ab-8fa1-0c402351b7b0","Type":"ContainerStarted","Data":"70c505ecdaca6ad09f8327c9ada29a647c0b9d023a7b6c11e238676d571c93e9"} Sep 30 07:07:29 crc kubenswrapper[4956]: I0930 07:07:29.524878 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/must-gather-5ls46" event={"ID":"c55bd359-a76e-48ab-8fa1-0c402351b7b0","Type":"ContainerStarted","Data":"d8a7ef2d7a5bcdc7f5f4ede90ea59ded5178e1f0e05d9d1ca381cbec94519b07"} Sep 30 07:07:29 crc kubenswrapper[4956]: I0930 07:07:29.525180 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/must-gather-5ls46" event={"ID":"c55bd359-a76e-48ab-8fa1-0c402351b7b0","Type":"ContainerStarted","Data":"41edb3d7cbfe7b5413100d95ab4faf2a04343c542b8c5d46c6cf2f208fb374ff"} Sep 30 07:07:29 crc kubenswrapper[4956]: I0930 07:07:29.548472 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mv6ft/must-gather-5ls46" podStartSLOduration=2.548450309 podStartE2EDuration="2.548450309s" podCreationTimestamp="2025-09-30 07:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:07:29.541303275 +0000 UTC m=+5919.868423790" watchObservedRunningTime="2025-09-30 07:07:29.548450309 +0000 UTC m=+5919.875570834" Sep 30 07:07:32 crc kubenswrapper[4956]: I0930 07:07:32.126327 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mv6ft/crc-debug-rvrzr"] Sep 30 07:07:32 crc kubenswrapper[4956]: 
I0930 07:07:32.127902 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" Sep 30 07:07:32 crc kubenswrapper[4956]: I0930 07:07:32.314597 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/838e378a-efc3-46ec-a934-9eb4d8d5475e-host\") pod \"crc-debug-rvrzr\" (UID: \"838e378a-efc3-46ec-a934-9eb4d8d5475e\") " pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" Sep 30 07:07:32 crc kubenswrapper[4956]: I0930 07:07:32.314666 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwd8n\" (UniqueName: \"kubernetes.io/projected/838e378a-efc3-46ec-a934-9eb4d8d5475e-kube-api-access-hwd8n\") pod \"crc-debug-rvrzr\" (UID: \"838e378a-efc3-46ec-a934-9eb4d8d5475e\") " pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" Sep 30 07:07:32 crc kubenswrapper[4956]: I0930 07:07:32.416386 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/838e378a-efc3-46ec-a934-9eb4d8d5475e-host\") pod \"crc-debug-rvrzr\" (UID: \"838e378a-efc3-46ec-a934-9eb4d8d5475e\") " pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" Sep 30 07:07:32 crc kubenswrapper[4956]: I0930 07:07:32.416438 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwd8n\" (UniqueName: \"kubernetes.io/projected/838e378a-efc3-46ec-a934-9eb4d8d5475e-kube-api-access-hwd8n\") pod \"crc-debug-rvrzr\" (UID: \"838e378a-efc3-46ec-a934-9eb4d8d5475e\") " pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" Sep 30 07:07:32 crc kubenswrapper[4956]: I0930 07:07:32.416484 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/838e378a-efc3-46ec-a934-9eb4d8d5475e-host\") pod \"crc-debug-rvrzr\" (UID: \"838e378a-efc3-46ec-a934-9eb4d8d5475e\") 
" pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" Sep 30 07:07:32 crc kubenswrapper[4956]: I0930 07:07:32.436076 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwd8n\" (UniqueName: \"kubernetes.io/projected/838e378a-efc3-46ec-a934-9eb4d8d5475e-kube-api-access-hwd8n\") pod \"crc-debug-rvrzr\" (UID: \"838e378a-efc3-46ec-a934-9eb4d8d5475e\") " pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" Sep 30 07:07:32 crc kubenswrapper[4956]: I0930 07:07:32.449029 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" Sep 30 07:07:32 crc kubenswrapper[4956]: W0930 07:07:32.495205 4956 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod838e378a_efc3_46ec_a934_9eb4d8d5475e.slice/crio-18c39517eb63dd3547be1da704ed9d220483f807c74be942b5642e11bb327743 WatchSource:0}: Error finding container 18c39517eb63dd3547be1da704ed9d220483f807c74be942b5642e11bb327743: Status 404 returned error can't find the container with id 18c39517eb63dd3547be1da704ed9d220483f807c74be942b5642e11bb327743 Sep 30 07:07:32 crc kubenswrapper[4956]: I0930 07:07:32.554139 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" event={"ID":"838e378a-efc3-46ec-a934-9eb4d8d5475e","Type":"ContainerStarted","Data":"18c39517eb63dd3547be1da704ed9d220483f807c74be942b5642e11bb327743"} Sep 30 07:07:33 crc kubenswrapper[4956]: I0930 07:07:33.563605 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" event={"ID":"838e378a-efc3-46ec-a934-9eb4d8d5475e","Type":"ContainerStarted","Data":"2d3769d08a7beffd9887fbf725d961df7ba2e863d70a22f2dad5d61d379d7af5"} Sep 30 07:07:33 crc kubenswrapper[4956]: I0930 07:07:33.581429 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" 
podStartSLOduration=1.5813886579999998 podStartE2EDuration="1.581388658s" podCreationTimestamp="2025-09-30 07:07:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:07:33.578188498 +0000 UTC m=+5923.905309023" watchObservedRunningTime="2025-09-30 07:07:33.581388658 +0000 UTC m=+5923.908509183" Sep 30 07:08:17 crc kubenswrapper[4956]: I0930 07:08:17.972108 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mkfn5"] Sep 30 07:08:17 crc kubenswrapper[4956]: I0930 07:08:17.976423 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:17 crc kubenswrapper[4956]: I0930 07:08:17.986553 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkfn5"] Sep 30 07:08:18 crc kubenswrapper[4956]: I0930 07:08:18.068127 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7942a40-a199-4f55-a40c-8783d8f549a4-utilities\") pod \"redhat-marketplace-mkfn5\" (UID: \"f7942a40-a199-4f55-a40c-8783d8f549a4\") " pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:18 crc kubenswrapper[4956]: I0930 07:08:18.068475 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7942a40-a199-4f55-a40c-8783d8f549a4-catalog-content\") pod \"redhat-marketplace-mkfn5\" (UID: \"f7942a40-a199-4f55-a40c-8783d8f549a4\") " pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:18 crc kubenswrapper[4956]: I0930 07:08:18.068740 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mcz9\" (UniqueName: 
\"kubernetes.io/projected/f7942a40-a199-4f55-a40c-8783d8f549a4-kube-api-access-2mcz9\") pod \"redhat-marketplace-mkfn5\" (UID: \"f7942a40-a199-4f55-a40c-8783d8f549a4\") " pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:18 crc kubenswrapper[4956]: I0930 07:08:18.170401 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mcz9\" (UniqueName: \"kubernetes.io/projected/f7942a40-a199-4f55-a40c-8783d8f549a4-kube-api-access-2mcz9\") pod \"redhat-marketplace-mkfn5\" (UID: \"f7942a40-a199-4f55-a40c-8783d8f549a4\") " pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:18 crc kubenswrapper[4956]: I0930 07:08:18.170778 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7942a40-a199-4f55-a40c-8783d8f549a4-utilities\") pod \"redhat-marketplace-mkfn5\" (UID: \"f7942a40-a199-4f55-a40c-8783d8f549a4\") " pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:18 crc kubenswrapper[4956]: I0930 07:08:18.170943 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7942a40-a199-4f55-a40c-8783d8f549a4-catalog-content\") pod \"redhat-marketplace-mkfn5\" (UID: \"f7942a40-a199-4f55-a40c-8783d8f549a4\") " pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:18 crc kubenswrapper[4956]: I0930 07:08:18.171340 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7942a40-a199-4f55-a40c-8783d8f549a4-utilities\") pod \"redhat-marketplace-mkfn5\" (UID: \"f7942a40-a199-4f55-a40c-8783d8f549a4\") " pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:18 crc kubenswrapper[4956]: I0930 07:08:18.171523 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f7942a40-a199-4f55-a40c-8783d8f549a4-catalog-content\") pod \"redhat-marketplace-mkfn5\" (UID: \"f7942a40-a199-4f55-a40c-8783d8f549a4\") " pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:18 crc kubenswrapper[4956]: I0930 07:08:18.192894 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mcz9\" (UniqueName: \"kubernetes.io/projected/f7942a40-a199-4f55-a40c-8783d8f549a4-kube-api-access-2mcz9\") pod \"redhat-marketplace-mkfn5\" (UID: \"f7942a40-a199-4f55-a40c-8783d8f549a4\") " pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:18 crc kubenswrapper[4956]: I0930 07:08:18.306340 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:18 crc kubenswrapper[4956]: I0930 07:08:18.839459 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkfn5"] Sep 30 07:08:19 crc kubenswrapper[4956]: I0930 07:08:19.002019 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkfn5" event={"ID":"f7942a40-a199-4f55-a40c-8783d8f549a4","Type":"ContainerStarted","Data":"e82bc3f95af3df3fcf5ff0790e7be25707b71504773239265faed058bdedea1f"} Sep 30 07:08:20 crc kubenswrapper[4956]: I0930 07:08:20.011729 4956 generic.go:334] "Generic (PLEG): container finished" podID="f7942a40-a199-4f55-a40c-8783d8f549a4" containerID="3b6b25fedf6dd3b2d7099297bf1fee5d7d8b696c102dd5f808d12c291d2c99f3" exitCode=0 Sep 30 07:08:20 crc kubenswrapper[4956]: I0930 07:08:20.011929 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkfn5" event={"ID":"f7942a40-a199-4f55-a40c-8783d8f549a4","Type":"ContainerDied","Data":"3b6b25fedf6dd3b2d7099297bf1fee5d7d8b696c102dd5f808d12c291d2c99f3"} Sep 30 07:08:21 crc kubenswrapper[4956]: I0930 07:08:21.024091 4956 generic.go:334] "Generic (PLEG): container 
finished" podID="f7942a40-a199-4f55-a40c-8783d8f549a4" containerID="237ce3a284e77733bfbefa422fd39718ad21dca7f007ece189aac29095e98e40" exitCode=0 Sep 30 07:08:21 crc kubenswrapper[4956]: I0930 07:08:21.024227 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkfn5" event={"ID":"f7942a40-a199-4f55-a40c-8783d8f549a4","Type":"ContainerDied","Data":"237ce3a284e77733bfbefa422fd39718ad21dca7f007ece189aac29095e98e40"} Sep 30 07:08:22 crc kubenswrapper[4956]: I0930 07:08:22.041594 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkfn5" event={"ID":"f7942a40-a199-4f55-a40c-8783d8f549a4","Type":"ContainerStarted","Data":"de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0"} Sep 30 07:08:22 crc kubenswrapper[4956]: I0930 07:08:22.068599 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mkfn5" podStartSLOduration=3.6457835579999998 podStartE2EDuration="5.068583028s" podCreationTimestamp="2025-09-30 07:08:17 +0000 UTC" firstStartedPulling="2025-09-30 07:08:20.013936211 +0000 UTC m=+5970.341056736" lastFinishedPulling="2025-09-30 07:08:21.436735681 +0000 UTC m=+5971.763856206" observedRunningTime="2025-09-30 07:08:22.058336396 +0000 UTC m=+5972.385456941" watchObservedRunningTime="2025-09-30 07:08:22.068583028 +0000 UTC m=+5972.395703553" Sep 30 07:08:28 crc kubenswrapper[4956]: I0930 07:08:28.306579 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:28 crc kubenswrapper[4956]: I0930 07:08:28.307161 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:28 crc kubenswrapper[4956]: I0930 07:08:28.367796 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 
30 07:08:29 crc kubenswrapper[4956]: I0930 07:08:29.167286 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:29 crc kubenswrapper[4956]: I0930 07:08:29.206828 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkfn5"] Sep 30 07:08:31 crc kubenswrapper[4956]: I0930 07:08:31.158251 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mkfn5" podUID="f7942a40-a199-4f55-a40c-8783d8f549a4" containerName="registry-server" containerID="cri-o://de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0" gracePeriod=2 Sep 30 07:08:31 crc kubenswrapper[4956]: I0930 07:08:31.691041 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:31 crc kubenswrapper[4956]: I0930 07:08:31.782163 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7942a40-a199-4f55-a40c-8783d8f549a4-utilities\") pod \"f7942a40-a199-4f55-a40c-8783d8f549a4\" (UID: \"f7942a40-a199-4f55-a40c-8783d8f549a4\") " Sep 30 07:08:31 crc kubenswrapper[4956]: I0930 07:08:31.782231 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7942a40-a199-4f55-a40c-8783d8f549a4-catalog-content\") pod \"f7942a40-a199-4f55-a40c-8783d8f549a4\" (UID: \"f7942a40-a199-4f55-a40c-8783d8f549a4\") " Sep 30 07:08:31 crc kubenswrapper[4956]: I0930 07:08:31.782420 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mcz9\" (UniqueName: \"kubernetes.io/projected/f7942a40-a199-4f55-a40c-8783d8f549a4-kube-api-access-2mcz9\") pod \"f7942a40-a199-4f55-a40c-8783d8f549a4\" (UID: \"f7942a40-a199-4f55-a40c-8783d8f549a4\") " Sep 
30 07:08:31 crc kubenswrapper[4956]: I0930 07:08:31.784002 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7942a40-a199-4f55-a40c-8783d8f549a4-utilities" (OuterVolumeSpecName: "utilities") pod "f7942a40-a199-4f55-a40c-8783d8f549a4" (UID: "f7942a40-a199-4f55-a40c-8783d8f549a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:08:31 crc kubenswrapper[4956]: I0930 07:08:31.796989 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7942a40-a199-4f55-a40c-8783d8f549a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7942a40-a199-4f55-a40c-8783d8f549a4" (UID: "f7942a40-a199-4f55-a40c-8783d8f549a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:08:31 crc kubenswrapper[4956]: I0930 07:08:31.803037 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7942a40-a199-4f55-a40c-8783d8f549a4-kube-api-access-2mcz9" (OuterVolumeSpecName: "kube-api-access-2mcz9") pod "f7942a40-a199-4f55-a40c-8783d8f549a4" (UID: "f7942a40-a199-4f55-a40c-8783d8f549a4"). InnerVolumeSpecName "kube-api-access-2mcz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:08:31 crc kubenswrapper[4956]: I0930 07:08:31.885181 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mcz9\" (UniqueName: \"kubernetes.io/projected/f7942a40-a199-4f55-a40c-8783d8f549a4-kube-api-access-2mcz9\") on node \"crc\" DevicePath \"\"" Sep 30 07:08:31 crc kubenswrapper[4956]: I0930 07:08:31.885213 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7942a40-a199-4f55-a40c-8783d8f549a4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:08:31 crc kubenswrapper[4956]: I0930 07:08:31.885224 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7942a40-a199-4f55-a40c-8783d8f549a4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.170453 4956 generic.go:334] "Generic (PLEG): container finished" podID="f7942a40-a199-4f55-a40c-8783d8f549a4" containerID="de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0" exitCode=0 Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.170494 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkfn5" event={"ID":"f7942a40-a199-4f55-a40c-8783d8f549a4","Type":"ContainerDied","Data":"de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0"} Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.170523 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkfn5" event={"ID":"f7942a40-a199-4f55-a40c-8783d8f549a4","Type":"ContainerDied","Data":"e82bc3f95af3df3fcf5ff0790e7be25707b71504773239265faed058bdedea1f"} Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.170528 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkfn5" Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.170539 4956 scope.go:117] "RemoveContainer" containerID="de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0" Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.198452 4956 scope.go:117] "RemoveContainer" containerID="237ce3a284e77733bfbefa422fd39718ad21dca7f007ece189aac29095e98e40" Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.205589 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkfn5"] Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.219040 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkfn5"] Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.241172 4956 scope.go:117] "RemoveContainer" containerID="3b6b25fedf6dd3b2d7099297bf1fee5d7d8b696c102dd5f808d12c291d2c99f3" Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.286495 4956 scope.go:117] "RemoveContainer" containerID="de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0" Sep 30 07:08:32 crc kubenswrapper[4956]: E0930 07:08:32.287567 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0\": container with ID starting with de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0 not found: ID does not exist" containerID="de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0" Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.287618 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0"} err="failed to get container status \"de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0\": rpc error: code = NotFound desc = could not find container 
\"de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0\": container with ID starting with de3107a78ea8e04cc8eca6af93b59718009bc3d5d5f8c936de48c1d3f0130ac0 not found: ID does not exist" Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.287649 4956 scope.go:117] "RemoveContainer" containerID="237ce3a284e77733bfbefa422fd39718ad21dca7f007ece189aac29095e98e40" Sep 30 07:08:32 crc kubenswrapper[4956]: E0930 07:08:32.287967 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"237ce3a284e77733bfbefa422fd39718ad21dca7f007ece189aac29095e98e40\": container with ID starting with 237ce3a284e77733bfbefa422fd39718ad21dca7f007ece189aac29095e98e40 not found: ID does not exist" containerID="237ce3a284e77733bfbefa422fd39718ad21dca7f007ece189aac29095e98e40" Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.287999 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237ce3a284e77733bfbefa422fd39718ad21dca7f007ece189aac29095e98e40"} err="failed to get container status \"237ce3a284e77733bfbefa422fd39718ad21dca7f007ece189aac29095e98e40\": rpc error: code = NotFound desc = could not find container \"237ce3a284e77733bfbefa422fd39718ad21dca7f007ece189aac29095e98e40\": container with ID starting with 237ce3a284e77733bfbefa422fd39718ad21dca7f007ece189aac29095e98e40 not found: ID does not exist" Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.288017 4956 scope.go:117] "RemoveContainer" containerID="3b6b25fedf6dd3b2d7099297bf1fee5d7d8b696c102dd5f808d12c291d2c99f3" Sep 30 07:08:32 crc kubenswrapper[4956]: E0930 07:08:32.288337 4956 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b6b25fedf6dd3b2d7099297bf1fee5d7d8b696c102dd5f808d12c291d2c99f3\": container with ID starting with 3b6b25fedf6dd3b2d7099297bf1fee5d7d8b696c102dd5f808d12c291d2c99f3 not found: ID does not exist" 
containerID="3b6b25fedf6dd3b2d7099297bf1fee5d7d8b696c102dd5f808d12c291d2c99f3" Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.288368 4956 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b6b25fedf6dd3b2d7099297bf1fee5d7d8b696c102dd5f808d12c291d2c99f3"} err="failed to get container status \"3b6b25fedf6dd3b2d7099297bf1fee5d7d8b696c102dd5f808d12c291d2c99f3\": rpc error: code = NotFound desc = could not find container \"3b6b25fedf6dd3b2d7099297bf1fee5d7d8b696c102dd5f808d12c291d2c99f3\": container with ID starting with 3b6b25fedf6dd3b2d7099297bf1fee5d7d8b696c102dd5f808d12c291d2c99f3 not found: ID does not exist" Sep 30 07:08:32 crc kubenswrapper[4956]: I0930 07:08:32.356006 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7942a40-a199-4f55-a40c-8783d8f549a4" path="/var/lib/kubelet/pods/f7942a40-a199-4f55-a40c-8783d8f549a4/volumes" Sep 30 07:08:51 crc kubenswrapper[4956]: I0930 07:08:51.570248 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-646f99fb9d-9lthp_45e3d2ce-91f3-420a-b8a2-ebeb4c113565/barbican-api/0.log" Sep 30 07:08:51 crc kubenswrapper[4956]: I0930 07:08:51.658647 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-646f99fb9d-9lthp_45e3d2ce-91f3-420a-b8a2-ebeb4c113565/barbican-api-log/0.log" Sep 30 07:08:51 crc kubenswrapper[4956]: I0930 07:08:51.798425 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-96cd79c6-2b9nr_bc0eca45-f776-4d77-8589-a2605f824696/barbican-keystone-listener/0.log" Sep 30 07:08:51 crc kubenswrapper[4956]: I0930 07:08:51.902235 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-96cd79c6-2b9nr_bc0eca45-f776-4d77-8589-a2605f824696/barbican-keystone-listener-log/0.log" Sep 30 07:08:52 crc kubenswrapper[4956]: I0930 07:08:52.265930 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-56bb449d8f-bh9xp_6c9006da-cf02-4bed-8247-02b6e929ff98/barbican-worker-log/0.log" Sep 30 07:08:52 crc kubenswrapper[4956]: I0930 07:08:52.283301 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56bb449d8f-bh9xp_6c9006da-cf02-4bed-8247-02b6e929ff98/barbican-worker/0.log" Sep 30 07:08:52 crc kubenswrapper[4956]: I0930 07:08:52.511523 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jxbkv_dc55cabb-5e01-4012-a02f-27ee023df0c4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:08:52 crc kubenswrapper[4956]: I0930 07:08:52.721781 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c3e1cc75-7672-44f7-acb3-69ef2ae910d1/ceilometer-central-agent/0.log" Sep 30 07:08:52 crc kubenswrapper[4956]: I0930 07:08:52.773623 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c3e1cc75-7672-44f7-acb3-69ef2ae910d1/proxy-httpd/0.log" Sep 30 07:08:52 crc kubenswrapper[4956]: I0930 07:08:52.773919 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c3e1cc75-7672-44f7-acb3-69ef2ae910d1/ceilometer-notification-agent/0.log" Sep 30 07:08:52 crc kubenswrapper[4956]: I0930 07:08:52.904501 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c3e1cc75-7672-44f7-acb3-69ef2ae910d1/sg-core/0.log" Sep 30 07:08:53 crc kubenswrapper[4956]: I0930 07:08:53.156226 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3b0e586a-4f48-4d87-9ecb-732f5723e089/cinder-api-log/0.log" Sep 30 07:08:53 crc kubenswrapper[4956]: I0930 07:08:53.375100 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3b0e586a-4f48-4d87-9ecb-732f5723e089/cinder-api/0.log" Sep 30 07:08:53 crc kubenswrapper[4956]: I0930 07:08:53.413935 4956 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_cinder-scheduler-0_6a945d1b-fce5-4069-8fb9-6c483a712cd2/cinder-scheduler/0.log" Sep 30 07:08:53 crc kubenswrapper[4956]: I0930 07:08:53.524545 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6a945d1b-fce5-4069-8fb9-6c483a712cd2/probe/0.log" Sep 30 07:08:53 crc kubenswrapper[4956]: I0930 07:08:53.658615 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ps2jm_7577cb07-1bcb-4432-a6e8-f57c1f1b2421/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:08:53 crc kubenswrapper[4956]: I0930 07:08:53.846731 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5mr6d_ec470ec9-3b1f-409d-ab54-44bb44daf1fe/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:08:53 crc kubenswrapper[4956]: I0930 07:08:53.986409 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-dnptc_c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee/init/0.log" Sep 30 07:08:54 crc kubenswrapper[4956]: I0930 07:08:54.238717 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-dnptc_c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee/init/0.log" Sep 30 07:08:54 crc kubenswrapper[4956]: I0930 07:08:54.362076 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-dnptc_c38cc2b9-0cab-4a36-8d6e-070fbb2b6bee/dnsmasq-dns/0.log" Sep 30 07:08:54 crc kubenswrapper[4956]: I0930 07:08:54.493860 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-sdxn6_6f2a0586-6940-498b-8eaa-1f16bf0ea2fa/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:08:54 crc kubenswrapper[4956]: I0930 07:08:54.618280 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_a834394a-7f87-43b1-aebb-e61b5916077c/glance-httpd/0.log" Sep 30 07:08:54 crc kubenswrapper[4956]: I0930 07:08:54.676670 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a834394a-7f87-43b1-aebb-e61b5916077c/glance-log/0.log" Sep 30 07:08:54 crc kubenswrapper[4956]: I0930 07:08:54.802178 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2d68b4b8-0213-4951-bf44-a8f7c6a1677c/glance-log/0.log" Sep 30 07:08:54 crc kubenswrapper[4956]: I0930 07:08:54.828093 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2d68b4b8-0213-4951-bf44-a8f7c6a1677c/glance-httpd/0.log" Sep 30 07:08:55 crc kubenswrapper[4956]: I0930 07:08:55.073799 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5f96b888bb-bhtl9_f29ac7f2-13b9-47d8-9218-fb08840e6704/horizon/0.log" Sep 30 07:08:55 crc kubenswrapper[4956]: I0930 07:08:55.195403 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-ffdqp_278c7daa-7016-4bde-8424-bd0c3491cb3c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:08:55 crc kubenswrapper[4956]: I0930 07:08:55.370281 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-tnwj7_cc76e3e2-ccde-4622-bcd2-fc347ee16271/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:08:55 crc kubenswrapper[4956]: I0930 07:08:55.780876 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320201-dhqp5_356a925e-f5c8-48b3-b62c-5c80f7566d01/keystone-cron/0.log" Sep 30 07:08:55 crc kubenswrapper[4956]: I0930 07:08:55.854821 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-5f96b888bb-bhtl9_f29ac7f2-13b9-47d8-9218-fb08840e6704/horizon-log/0.log" Sep 30 07:08:55 crc kubenswrapper[4956]: I0930 07:08:55.957489 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320261-p55tp_f9787291-2195-4229-910e-88f41de812c8/keystone-cron/0.log" Sep 30 07:08:56 crc kubenswrapper[4956]: I0930 07:08:56.137052 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6798cf9d78-m426q_1997765b-9597-4f93-a11a-8df4f572dee4/keystone-api/0.log" Sep 30 07:08:56 crc kubenswrapper[4956]: I0930 07:08:56.161554 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1b53d536-4a4f-463d-bae5-360555cd4583/kube-state-metrics/0.log" Sep 30 07:08:56 crc kubenswrapper[4956]: I0930 07:08:56.300907 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-scntk_16fa92a8-7fcd-45bd-9b5a-f77149ec71f4/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:08:56 crc kubenswrapper[4956]: I0930 07:08:56.805444 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cbdcdc45c-b9667_0a1e613a-d4fe-4779-97f9-a931d68f083c/neutron-httpd/0.log" Sep 30 07:08:56 crc kubenswrapper[4956]: I0930 07:08:56.823454 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cbdcdc45c-b9667_0a1e613a-d4fe-4779-97f9-a931d68f083c/neutron-api/0.log" Sep 30 07:08:56 crc kubenswrapper[4956]: I0930 07:08:56.923536 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-g547c_f99ede59-582c-4ed5-99e4-6ba65d66aedb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:08:57 crc kubenswrapper[4956]: I0930 07:08:57.930413 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_c3275745-4918-4509-ab41-ad6e6653bcc8/nova-cell0-conductor-conductor/0.log" Sep 30 07:08:58 crc kubenswrapper[4956]: I0930 07:08:58.589178 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6a7e115a-efb9-4ea4-aed6-efa6d4b80203/nova-api-log/0.log" Sep 30 07:08:58 crc kubenswrapper[4956]: I0930 07:08:58.596433 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0515ec62-950a-4ab8-8462-7030b37609db/nova-cell1-conductor-conductor/0.log" Sep 30 07:08:58 crc kubenswrapper[4956]: I0930 07:08:58.953819 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6a7e115a-efb9-4ea4-aed6-efa6d4b80203/nova-api-api/0.log" Sep 30 07:08:58 crc kubenswrapper[4956]: I0930 07:08:58.971177 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5b82f755-357a-4b4e-84bd-1712077f17a5/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 07:08:59 crc kubenswrapper[4956]: I0930 07:08:59.248253 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-n4vfr_8c77505e-cdca-4f43-a276-2102b2c33a58/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:08:59 crc kubenswrapper[4956]: I0930 07:08:59.399786 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0effab61-b755-4bd2-afb3-71cdf7983dc3/nova-metadata-log/0.log" Sep 30 07:08:59 crc kubenswrapper[4956]: I0930 07:08:59.972151 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e0e95031-1e0d-4979-9926-ba52d0208646/mysql-bootstrap/0.log" Sep 30 07:09:00 crc kubenswrapper[4956]: I0930 07:09:00.008412 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_94c3b340-db8b-4ec8-8554-556d914309a2/nova-scheduler-scheduler/0.log" Sep 30 07:09:00 crc kubenswrapper[4956]: I0930 
07:09:00.158686 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e0e95031-1e0d-4979-9926-ba52d0208646/mysql-bootstrap/0.log" Sep 30 07:09:00 crc kubenswrapper[4956]: I0930 07:09:00.248167 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e0e95031-1e0d-4979-9926-ba52d0208646/galera/0.log" Sep 30 07:09:00 crc kubenswrapper[4956]: I0930 07:09:00.509540 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_541322a8-d098-4331-ab5b-500262d4655c/mysql-bootstrap/0.log" Sep 30 07:09:00 crc kubenswrapper[4956]: I0930 07:09:00.766593 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_541322a8-d098-4331-ab5b-500262d4655c/galera/0.log" Sep 30 07:09:00 crc kubenswrapper[4956]: I0930 07:09:00.780973 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_541322a8-d098-4331-ab5b-500262d4655c/mysql-bootstrap/0.log" Sep 30 07:09:01 crc kubenswrapper[4956]: I0930 07:09:01.004703 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f596b95b-7b2b-4d7b-8f33-9eb214a39a21/openstackclient/0.log" Sep 30 07:09:01 crc kubenswrapper[4956]: I0930 07:09:01.198514 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hfrcl_351735b0-6497-4a6c-9562-1ad2785ead5f/openstack-network-exporter/0.log" Sep 30 07:09:01 crc kubenswrapper[4956]: I0930 07:09:01.549602 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p9lvd_f61a8fc3-4802-4dfb-b17e-fd4b8db1b863/ovsdb-server-init/0.log" Sep 30 07:09:01 crc kubenswrapper[4956]: I0930 07:09:01.686417 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p9lvd_f61a8fc3-4802-4dfb-b17e-fd4b8db1b863/ovsdb-server-init/0.log" Sep 30 07:09:01 crc kubenswrapper[4956]: I0930 07:09:01.884788 4956 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p9lvd_f61a8fc3-4802-4dfb-b17e-fd4b8db1b863/ovsdb-server/0.log" Sep 30 07:09:01 crc kubenswrapper[4956]: I0930 07:09:01.976616 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0effab61-b755-4bd2-afb3-71cdf7983dc3/nova-metadata-metadata/0.log" Sep 30 07:09:02 crc kubenswrapper[4956]: I0930 07:09:02.103777 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-p9lvd_f61a8fc3-4802-4dfb-b17e-fd4b8db1b863/ovs-vswitchd/0.log" Sep 30 07:09:02 crc kubenswrapper[4956]: I0930 07:09:02.445882 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-r7wfs_611676cd-11d2-44c4-bae2-41b6b22f898d/ovn-controller/0.log" Sep 30 07:09:02 crc kubenswrapper[4956]: I0930 07:09:02.599431 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fb24p_a6c38cbf-b86e-473b-8f78-b917dc31d239/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:09:02 crc kubenswrapper[4956]: I0930 07:09:02.833574 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c0fedcc9-8df5-495f-adb8-a42a2a811c49/openstack-network-exporter/0.log" Sep 30 07:09:02 crc kubenswrapper[4956]: I0930 07:09:02.834575 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c0fedcc9-8df5-495f-adb8-a42a2a811c49/ovn-northd/0.log" Sep 30 07:09:03 crc kubenswrapper[4956]: I0930 07:09:03.338224 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_82458356-9089-4c9d-a672-746eb618af3d/openstack-network-exporter/0.log" Sep 30 07:09:03 crc kubenswrapper[4956]: I0930 07:09:03.358284 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_82458356-9089-4c9d-a672-746eb618af3d/ovsdbserver-nb/0.log" Sep 30 07:09:03 crc kubenswrapper[4956]: I0930 
07:09:03.577543 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bd6f01ba-ec4b-4b98-8a33-e029d86b258b/openstack-network-exporter/0.log" Sep 30 07:09:03 crc kubenswrapper[4956]: I0930 07:09:03.613841 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bd6f01ba-ec4b-4b98-8a33-e029d86b258b/ovsdbserver-sb/0.log" Sep 30 07:09:04 crc kubenswrapper[4956]: I0930 07:09:04.117988 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-654d9b45dd-f9lqj_17e5ec46-ef77-490f-b564-b3d4426dd9c8/placement-api/0.log" Sep 30 07:09:04 crc kubenswrapper[4956]: I0930 07:09:04.221196 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-654d9b45dd-f9lqj_17e5ec46-ef77-490f-b564-b3d4426dd9c8/placement-log/0.log" Sep 30 07:09:04 crc kubenswrapper[4956]: I0930 07:09:04.394850 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_949e372a-c9a8-4db6-a275-512d0236dd01/init-config-reloader/0.log" Sep 30 07:09:04 crc kubenswrapper[4956]: I0930 07:09:04.670543 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_949e372a-c9a8-4db6-a275-512d0236dd01/init-config-reloader/0.log" Sep 30 07:09:04 crc kubenswrapper[4956]: I0930 07:09:04.671588 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_949e372a-c9a8-4db6-a275-512d0236dd01/prometheus/0.log" Sep 30 07:09:04 crc kubenswrapper[4956]: I0930 07:09:04.752449 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_949e372a-c9a8-4db6-a275-512d0236dd01/config-reloader/0.log" Sep 30 07:09:05 crc kubenswrapper[4956]: I0930 07:09:05.263660 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a47b02d9-bc18-4fa1-a0a9-1918de176de9/setup-container/0.log" Sep 30 07:09:05 crc 
kubenswrapper[4956]: I0930 07:09:05.297877 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_949e372a-c9a8-4db6-a275-512d0236dd01/thanos-sidecar/0.log" Sep 30 07:09:05 crc kubenswrapper[4956]: I0930 07:09:05.593509 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a47b02d9-bc18-4fa1-a0a9-1918de176de9/setup-container/0.log" Sep 30 07:09:05 crc kubenswrapper[4956]: I0930 07:09:05.602626 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a47b02d9-bc18-4fa1-a0a9-1918de176de9/rabbitmq/0.log" Sep 30 07:09:05 crc kubenswrapper[4956]: I0930 07:09:05.775537 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_de3a8c94-71b5-4948-9079-cc7009b9a8ea/setup-container/0.log" Sep 30 07:09:06 crc kubenswrapper[4956]: I0930 07:09:06.080377 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_de3a8c94-71b5-4948-9079-cc7009b9a8ea/setup-container/0.log" Sep 30 07:09:06 crc kubenswrapper[4956]: I0930 07:09:06.127511 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_de3a8c94-71b5-4948-9079-cc7009b9a8ea/rabbitmq/0.log" Sep 30 07:09:06 crc kubenswrapper[4956]: I0930 07:09:06.504153 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ac3f1eed-d128-4008-bbe5-0f319495ef52/setup-container/0.log" Sep 30 07:09:06 crc kubenswrapper[4956]: I0930 07:09:06.667095 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ac3f1eed-d128-4008-bbe5-0f319495ef52/setup-container/0.log" Sep 30 07:09:06 crc kubenswrapper[4956]: I0930 07:09:06.765323 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ac3f1eed-d128-4008-bbe5-0f319495ef52/rabbitmq/0.log" Sep 30 07:09:06 crc kubenswrapper[4956]: 
I0930 07:09:06.941992 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-bwcjk_17e89a19-be9f-454b-b1c9-8f5b813a9a3b/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:09:07 crc kubenswrapper[4956]: I0930 07:09:07.025192 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-6c6wt_70901491-8063-4429-baee-c1a295960e2c/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:09:07 crc kubenswrapper[4956]: I0930 07:09:07.278044 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-fc627_2b60bb30-87ca-43de-a737-1a0fc105197e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:09:07 crc kubenswrapper[4956]: I0930 07:09:07.385535 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-785q7_fd6826cb-1434-4917-a836-ae952394b1ca/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:09:07 crc kubenswrapper[4956]: I0930 07:09:07.496939 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-d8mjv_9ad04c2f-1706-4b52-9267-613f86dc0388/ssh-known-hosts-edpm-deployment/0.log" Sep 30 07:09:07 crc kubenswrapper[4956]: I0930 07:09:07.794401 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-779b5888b9-9hp77_62113d6a-1e88-402d-b6bd-4f119a6df416/proxy-server/0.log" Sep 30 07:09:07 crc kubenswrapper[4956]: I0930 07:09:07.949679 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-779b5888b9-9hp77_62113d6a-1e88-402d-b6bd-4f119a6df416/proxy-httpd/0.log" Sep 30 07:09:08 crc kubenswrapper[4956]: I0930 07:09:08.045458 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-2zwhw_547b221a-e3b1-4a31-b09c-07022356f1e9/swift-ring-rebalance/0.log" Sep 30 
07:09:08 crc kubenswrapper[4956]: I0930 07:09:08.204303 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/account-auditor/0.log" Sep 30 07:09:08 crc kubenswrapper[4956]: I0930 07:09:08.324133 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/account-reaper/0.log" Sep 30 07:09:08 crc kubenswrapper[4956]: I0930 07:09:08.546596 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/account-replicator/0.log" Sep 30 07:09:08 crc kubenswrapper[4956]: I0930 07:09:08.594552 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/account-server/0.log" Sep 30 07:09:08 crc kubenswrapper[4956]: I0930 07:09:08.609224 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/container-auditor/0.log" Sep 30 07:09:08 crc kubenswrapper[4956]: I0930 07:09:08.841323 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/container-replicator/0.log" Sep 30 07:09:08 crc kubenswrapper[4956]: I0930 07:09:08.862731 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/container-updater/0.log" Sep 30 07:09:08 crc kubenswrapper[4956]: I0930 07:09:08.929703 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/container-server/0.log" Sep 30 07:09:09 crc kubenswrapper[4956]: I0930 07:09:09.115196 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/object-expirer/0.log" Sep 30 07:09:09 crc kubenswrapper[4956]: I0930 07:09:09.137142 4956 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/object-auditor/0.log" Sep 30 07:09:09 crc kubenswrapper[4956]: I0930 07:09:09.213248 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/object-replicator/0.log" Sep 30 07:09:09 crc kubenswrapper[4956]: I0930 07:09:09.352396 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/object-server/0.log" Sep 30 07:09:09 crc kubenswrapper[4956]: I0930 07:09:09.436407 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/object-updater/0.log" Sep 30 07:09:09 crc kubenswrapper[4956]: I0930 07:09:09.464341 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/rsync/0.log" Sep 30 07:09:09 crc kubenswrapper[4956]: I0930 07:09:09.895736 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_38f02895-66c4-4da2-b408-838646d7ecbd/swift-recon-cron/0.log" Sep 30 07:09:10 crc kubenswrapper[4956]: I0930 07:09:10.017885 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nhd6c_67054ea3-1f2b-43dd-ada6-8908e9f7c2de/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:09:10 crc kubenswrapper[4956]: I0930 07:09:10.298381 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_c822eb6a-ddf6-44d5-8a3c-35408a3a0f69/tempest-tests-tempest-tests-runner/0.log" Sep 30 07:09:10 crc kubenswrapper[4956]: I0930 07:09:10.415698 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0b31d8d5-0881-4a71-b953-2e70288e3190/test-operator-logs-container/0.log" Sep 30 07:09:10 crc 
kubenswrapper[4956]: I0930 07:09:10.558337 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-4j92f_8861310e-59f9-47fc-9224-fc01da1aab28/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:09:11 crc kubenswrapper[4956]: I0930 07:09:11.882127 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_48193dc0-3638-40e3-8b54-2b2049bd5925/watcher-applier/0.log" Sep 30 07:09:12 crc kubenswrapper[4956]: I0930 07:09:12.158456 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_5268a4a4-b72f-47e6-a485-04dcc5935087/watcher-api-log/0.log" Sep 30 07:09:12 crc kubenswrapper[4956]: I0930 07:09:12.422839 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_aad83b1a-8ea2-4f43-a6e3-b8e844a65115/watcher-decision-engine/2.log" Sep 30 07:09:16 crc kubenswrapper[4956]: I0930 07:09:16.671539 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_aad83b1a-8ea2-4f43-a6e3-b8e844a65115/watcher-decision-engine/3.log" Sep 30 07:09:17 crc kubenswrapper[4956]: I0930 07:09:17.547537 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_5268a4a4-b72f-47e6-a485-04dcc5935087/watcher-api/0.log" Sep 30 07:09:18 crc kubenswrapper[4956]: I0930 07:09:18.699187 4956 scope.go:117] "RemoveContainer" containerID="c43ae9940271b9d78188c0b04f0a2c933a0cccdc944130e7224cfc5a624aaad2" Sep 30 07:09:25 crc kubenswrapper[4956]: I0930 07:09:25.274553 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_b5126f1e-65cb-4032-9df3-8cd061c43253/memcached/0.log" Sep 30 07:09:40 crc kubenswrapper[4956]: I0930 07:09:40.923672 4956 generic.go:334] "Generic (PLEG): container finished" podID="838e378a-efc3-46ec-a934-9eb4d8d5475e" containerID="2d3769d08a7beffd9887fbf725d961df7ba2e863d70a22f2dad5d61d379d7af5" 
exitCode=0 Sep 30 07:09:40 crc kubenswrapper[4956]: I0930 07:09:40.923855 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" event={"ID":"838e378a-efc3-46ec-a934-9eb4d8d5475e","Type":"ContainerDied","Data":"2d3769d08a7beffd9887fbf725d961df7ba2e863d70a22f2dad5d61d379d7af5"} Sep 30 07:09:42 crc kubenswrapper[4956]: I0930 07:09:42.033063 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" Sep 30 07:09:42 crc kubenswrapper[4956]: I0930 07:09:42.066751 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mv6ft/crc-debug-rvrzr"] Sep 30 07:09:42 crc kubenswrapper[4956]: I0930 07:09:42.074582 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mv6ft/crc-debug-rvrzr"] Sep 30 07:09:42 crc kubenswrapper[4956]: I0930 07:09:42.138974 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwd8n\" (UniqueName: \"kubernetes.io/projected/838e378a-efc3-46ec-a934-9eb4d8d5475e-kube-api-access-hwd8n\") pod \"838e378a-efc3-46ec-a934-9eb4d8d5475e\" (UID: \"838e378a-efc3-46ec-a934-9eb4d8d5475e\") " Sep 30 07:09:42 crc kubenswrapper[4956]: I0930 07:09:42.139074 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/838e378a-efc3-46ec-a934-9eb4d8d5475e-host\") pod \"838e378a-efc3-46ec-a934-9eb4d8d5475e\" (UID: \"838e378a-efc3-46ec-a934-9eb4d8d5475e\") " Sep 30 07:09:42 crc kubenswrapper[4956]: I0930 07:09:42.139234 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/838e378a-efc3-46ec-a934-9eb4d8d5475e-host" (OuterVolumeSpecName: "host") pod "838e378a-efc3-46ec-a934-9eb4d8d5475e" (UID: "838e378a-efc3-46ec-a934-9eb4d8d5475e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:09:42 crc kubenswrapper[4956]: I0930 07:09:42.139792 4956 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/838e378a-efc3-46ec-a934-9eb4d8d5475e-host\") on node \"crc\" DevicePath \"\"" Sep 30 07:09:42 crc kubenswrapper[4956]: I0930 07:09:42.145340 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/838e378a-efc3-46ec-a934-9eb4d8d5475e-kube-api-access-hwd8n" (OuterVolumeSpecName: "kube-api-access-hwd8n") pod "838e378a-efc3-46ec-a934-9eb4d8d5475e" (UID: "838e378a-efc3-46ec-a934-9eb4d8d5475e"). InnerVolumeSpecName "kube-api-access-hwd8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:09:42 crc kubenswrapper[4956]: I0930 07:09:42.241262 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwd8n\" (UniqueName: \"kubernetes.io/projected/838e378a-efc3-46ec-a934-9eb4d8d5475e-kube-api-access-hwd8n\") on node \"crc\" DevicePath \"\"" Sep 30 07:09:42 crc kubenswrapper[4956]: I0930 07:09:42.351811 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="838e378a-efc3-46ec-a934-9eb4d8d5475e" path="/var/lib/kubelet/pods/838e378a-efc3-46ec-a934-9eb4d8d5475e/volumes" Sep 30 07:09:42 crc kubenswrapper[4956]: I0930 07:09:42.945268 4956 scope.go:117] "RemoveContainer" containerID="2d3769d08a7beffd9887fbf725d961df7ba2e863d70a22f2dad5d61d379d7af5" Sep 30 07:09:42 crc kubenswrapper[4956]: I0930 07:09:42.945344 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mv6ft/crc-debug-rvrzr" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.224063 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mv6ft/crc-debug-jmkvx"] Sep 30 07:09:43 crc kubenswrapper[4956]: E0930 07:09:43.224483 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="838e378a-efc3-46ec-a934-9eb4d8d5475e" containerName="container-00" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.224495 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="838e378a-efc3-46ec-a934-9eb4d8d5475e" containerName="container-00" Sep 30 07:09:43 crc kubenswrapper[4956]: E0930 07:09:43.224521 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7942a40-a199-4f55-a40c-8783d8f549a4" containerName="extract-utilities" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.224527 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7942a40-a199-4f55-a40c-8783d8f549a4" containerName="extract-utilities" Sep 30 07:09:43 crc kubenswrapper[4956]: E0930 07:09:43.224542 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7942a40-a199-4f55-a40c-8783d8f549a4" containerName="registry-server" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.224549 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7942a40-a199-4f55-a40c-8783d8f549a4" containerName="registry-server" Sep 30 07:09:43 crc kubenswrapper[4956]: E0930 07:09:43.224560 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7942a40-a199-4f55-a40c-8783d8f549a4" containerName="extract-content" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.224565 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7942a40-a199-4f55-a40c-8783d8f549a4" containerName="extract-content" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.224776 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="838e378a-efc3-46ec-a934-9eb4d8d5475e" 
containerName="container-00" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.224787 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7942a40-a199-4f55-a40c-8783d8f549a4" containerName="registry-server" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.225463 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.258554 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x5xm\" (UniqueName: \"kubernetes.io/projected/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2-kube-api-access-7x5xm\") pod \"crc-debug-jmkvx\" (UID: \"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2\") " pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.258711 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2-host\") pod \"crc-debug-jmkvx\" (UID: \"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2\") " pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.361160 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2-host\") pod \"crc-debug-jmkvx\" (UID: \"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2\") " pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.361274 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2-host\") pod \"crc-debug-jmkvx\" (UID: \"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2\") " pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.361461 4956 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x5xm\" (UniqueName: \"kubernetes.io/projected/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2-kube-api-access-7x5xm\") pod \"crc-debug-jmkvx\" (UID: \"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2\") " pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.378012 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x5xm\" (UniqueName: \"kubernetes.io/projected/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2-kube-api-access-7x5xm\") pod \"crc-debug-jmkvx\" (UID: \"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2\") " pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.545823 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.958512 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" event={"ID":"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2","Type":"ContainerStarted","Data":"29e00ab6d21f5612fe5dc10242584dd15fd376808265ae5ddbe88614031a8f5f"} Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.958858 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" event={"ID":"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2","Type":"ContainerStarted","Data":"5b8d24250967ece3d3d5d996cc5afd0a5ecfe6e6a3d3cb177336dc67ef24ecc3"} Sep 30 07:09:43 crc kubenswrapper[4956]: I0930 07:09:43.988074 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" podStartSLOduration=0.988054486 podStartE2EDuration="988.054486ms" podCreationTimestamp="2025-09-30 07:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 
07:09:43.973046786 +0000 UTC m=+6054.300167311" watchObservedRunningTime="2025-09-30 07:09:43.988054486 +0000 UTC m=+6054.315175021" Sep 30 07:09:44 crc kubenswrapper[4956]: I0930 07:09:44.967302 4956 generic.go:334] "Generic (PLEG): container finished" podID="4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2" containerID="29e00ab6d21f5612fe5dc10242584dd15fd376808265ae5ddbe88614031a8f5f" exitCode=0 Sep 30 07:09:44 crc kubenswrapper[4956]: I0930 07:09:44.968294 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" event={"ID":"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2","Type":"ContainerDied","Data":"29e00ab6d21f5612fe5dc10242584dd15fd376808265ae5ddbe88614031a8f5f"} Sep 30 07:09:46 crc kubenswrapper[4956]: I0930 07:09:46.117437 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" Sep 30 07:09:46 crc kubenswrapper[4956]: I0930 07:09:46.212329 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2-host\") pod \"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2\" (UID: \"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2\") " Sep 30 07:09:46 crc kubenswrapper[4956]: I0930 07:09:46.212418 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2-host" (OuterVolumeSpecName: "host") pod "4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2" (UID: "4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:09:46 crc kubenswrapper[4956]: I0930 07:09:46.212936 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x5xm\" (UniqueName: \"kubernetes.io/projected/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2-kube-api-access-7x5xm\") pod \"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2\" (UID: \"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2\") " Sep 30 07:09:46 crc kubenswrapper[4956]: I0930 07:09:46.213645 4956 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2-host\") on node \"crc\" DevicePath \"\"" Sep 30 07:09:46 crc kubenswrapper[4956]: I0930 07:09:46.219593 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2-kube-api-access-7x5xm" (OuterVolumeSpecName: "kube-api-access-7x5xm") pod "4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2" (UID: "4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2"). InnerVolumeSpecName "kube-api-access-7x5xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:09:46 crc kubenswrapper[4956]: I0930 07:09:46.314908 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x5xm\" (UniqueName: \"kubernetes.io/projected/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2-kube-api-access-7x5xm\") on node \"crc\" DevicePath \"\"" Sep 30 07:09:47 crc kubenswrapper[4956]: I0930 07:09:47.011403 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" Sep 30 07:09:47 crc kubenswrapper[4956]: I0930 07:09:47.011399 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/crc-debug-jmkvx" event={"ID":"4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2","Type":"ContainerDied","Data":"5b8d24250967ece3d3d5d996cc5afd0a5ecfe6e6a3d3cb177336dc67ef24ecc3"} Sep 30 07:09:47 crc kubenswrapper[4956]: I0930 07:09:47.011469 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b8d24250967ece3d3d5d996cc5afd0a5ecfe6e6a3d3cb177336dc67ef24ecc3" Sep 30 07:09:48 crc kubenswrapper[4956]: I0930 07:09:48.073268 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:09:48 crc kubenswrapper[4956]: I0930 07:09:48.073319 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:09:54 crc kubenswrapper[4956]: I0930 07:09:54.213207 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mv6ft/crc-debug-jmkvx"] Sep 30 07:09:54 crc kubenswrapper[4956]: I0930 07:09:54.220910 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mv6ft/crc-debug-jmkvx"] Sep 30 07:09:54 crc kubenswrapper[4956]: I0930 07:09:54.353352 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2" path="/var/lib/kubelet/pods/4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2/volumes" Sep 30 07:09:55 crc kubenswrapper[4956]: I0930 
07:09:55.392224 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mv6ft/crc-debug-llgfs"] Sep 30 07:09:55 crc kubenswrapper[4956]: E0930 07:09:55.392642 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2" containerName="container-00" Sep 30 07:09:55 crc kubenswrapper[4956]: I0930 07:09:55.392654 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2" containerName="container-00" Sep 30 07:09:55 crc kubenswrapper[4956]: I0930 07:09:55.392863 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2ee7f4-7d8b-4ca1-93b4-fdb4eaa6eeb2" containerName="container-00" Sep 30 07:09:55 crc kubenswrapper[4956]: I0930 07:09:55.393544 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv6ft/crc-debug-llgfs" Sep 30 07:09:55 crc kubenswrapper[4956]: I0930 07:09:55.483448 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrd2m\" (UniqueName: \"kubernetes.io/projected/35ed2199-27ea-4280-8dad-f1a87a108724-kube-api-access-xrd2m\") pod \"crc-debug-llgfs\" (UID: \"35ed2199-27ea-4280-8dad-f1a87a108724\") " pod="openshift-must-gather-mv6ft/crc-debug-llgfs" Sep 30 07:09:55 crc kubenswrapper[4956]: I0930 07:09:55.483562 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35ed2199-27ea-4280-8dad-f1a87a108724-host\") pod \"crc-debug-llgfs\" (UID: \"35ed2199-27ea-4280-8dad-f1a87a108724\") " pod="openshift-must-gather-mv6ft/crc-debug-llgfs" Sep 30 07:09:55 crc kubenswrapper[4956]: I0930 07:09:55.585546 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrd2m\" (UniqueName: \"kubernetes.io/projected/35ed2199-27ea-4280-8dad-f1a87a108724-kube-api-access-xrd2m\") pod \"crc-debug-llgfs\" (UID: 
\"35ed2199-27ea-4280-8dad-f1a87a108724\") " pod="openshift-must-gather-mv6ft/crc-debug-llgfs" Sep 30 07:09:55 crc kubenswrapper[4956]: I0930 07:09:55.586013 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35ed2199-27ea-4280-8dad-f1a87a108724-host\") pod \"crc-debug-llgfs\" (UID: \"35ed2199-27ea-4280-8dad-f1a87a108724\") " pod="openshift-must-gather-mv6ft/crc-debug-llgfs" Sep 30 07:09:55 crc kubenswrapper[4956]: I0930 07:09:55.586142 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35ed2199-27ea-4280-8dad-f1a87a108724-host\") pod \"crc-debug-llgfs\" (UID: \"35ed2199-27ea-4280-8dad-f1a87a108724\") " pod="openshift-must-gather-mv6ft/crc-debug-llgfs" Sep 30 07:09:55 crc kubenswrapper[4956]: I0930 07:09:55.603291 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrd2m\" (UniqueName: \"kubernetes.io/projected/35ed2199-27ea-4280-8dad-f1a87a108724-kube-api-access-xrd2m\") pod \"crc-debug-llgfs\" (UID: \"35ed2199-27ea-4280-8dad-f1a87a108724\") " pod="openshift-must-gather-mv6ft/crc-debug-llgfs" Sep 30 07:09:55 crc kubenswrapper[4956]: I0930 07:09:55.711534 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mv6ft/crc-debug-llgfs" Sep 30 07:09:56 crc kubenswrapper[4956]: I0930 07:09:56.094964 4956 generic.go:334] "Generic (PLEG): container finished" podID="35ed2199-27ea-4280-8dad-f1a87a108724" containerID="e3b698f647cc39260428d4dd0264ab881b6d73e74115c7c9255c9d0d846c1166" exitCode=0 Sep 30 07:09:56 crc kubenswrapper[4956]: I0930 07:09:56.095064 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/crc-debug-llgfs" event={"ID":"35ed2199-27ea-4280-8dad-f1a87a108724","Type":"ContainerDied","Data":"e3b698f647cc39260428d4dd0264ab881b6d73e74115c7c9255c9d0d846c1166"} Sep 30 07:09:56 crc kubenswrapper[4956]: I0930 07:09:56.095307 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/crc-debug-llgfs" event={"ID":"35ed2199-27ea-4280-8dad-f1a87a108724","Type":"ContainerStarted","Data":"8f3888d715b71f51bcb01088cea70b8dc66cec1a672e1b51e26dbb1f8531d254"} Sep 30 07:09:56 crc kubenswrapper[4956]: I0930 07:09:56.129525 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mv6ft/crc-debug-llgfs"] Sep 30 07:09:56 crc kubenswrapper[4956]: I0930 07:09:56.137778 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mv6ft/crc-debug-llgfs"] Sep 30 07:09:57 crc kubenswrapper[4956]: I0930 07:09:57.219355 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mv6ft/crc-debug-llgfs" Sep 30 07:09:57 crc kubenswrapper[4956]: I0930 07:09:57.317327 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35ed2199-27ea-4280-8dad-f1a87a108724-host\") pod \"35ed2199-27ea-4280-8dad-f1a87a108724\" (UID: \"35ed2199-27ea-4280-8dad-f1a87a108724\") " Sep 30 07:09:57 crc kubenswrapper[4956]: I0930 07:09:57.317453 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35ed2199-27ea-4280-8dad-f1a87a108724-host" (OuterVolumeSpecName: "host") pod "35ed2199-27ea-4280-8dad-f1a87a108724" (UID: "35ed2199-27ea-4280-8dad-f1a87a108724"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:09:57 crc kubenswrapper[4956]: I0930 07:09:57.317500 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrd2m\" (UniqueName: \"kubernetes.io/projected/35ed2199-27ea-4280-8dad-f1a87a108724-kube-api-access-xrd2m\") pod \"35ed2199-27ea-4280-8dad-f1a87a108724\" (UID: \"35ed2199-27ea-4280-8dad-f1a87a108724\") " Sep 30 07:09:57 crc kubenswrapper[4956]: I0930 07:09:57.318041 4956 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35ed2199-27ea-4280-8dad-f1a87a108724-host\") on node \"crc\" DevicePath \"\"" Sep 30 07:09:57 crc kubenswrapper[4956]: I0930 07:09:57.333970 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ed2199-27ea-4280-8dad-f1a87a108724-kube-api-access-xrd2m" (OuterVolumeSpecName: "kube-api-access-xrd2m") pod "35ed2199-27ea-4280-8dad-f1a87a108724" (UID: "35ed2199-27ea-4280-8dad-f1a87a108724"). InnerVolumeSpecName "kube-api-access-xrd2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:09:57 crc kubenswrapper[4956]: I0930 07:09:57.419856 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrd2m\" (UniqueName: \"kubernetes.io/projected/35ed2199-27ea-4280-8dad-f1a87a108724-kube-api-access-xrd2m\") on node \"crc\" DevicePath \"\"" Sep 30 07:09:57 crc kubenswrapper[4956]: I0930 07:09:57.698407 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/util/0.log" Sep 30 07:09:57 crc kubenswrapper[4956]: I0930 07:09:57.889222 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/pull/0.log" Sep 30 07:09:57 crc kubenswrapper[4956]: I0930 07:09:57.894126 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/pull/0.log" Sep 30 07:09:57 crc kubenswrapper[4956]: I0930 07:09:57.942987 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/util/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.084074 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/extract/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.090376 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/pull/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.092454 4956 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469729fn_7b070be8-a2e3-4b53-bfa6-502bc50daf72/util/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.133237 4956 scope.go:117] "RemoveContainer" containerID="e3b698f647cc39260428d4dd0264ab881b6d73e74115c7c9255c9d0d846c1166" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.133439 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv6ft/crc-debug-llgfs" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.267011 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-mrz5k_e8df7824-e9a8-4794-bb91-411ae6639639/kube-rbac-proxy/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.328361 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-hhv4p_3c1366b7-aa64-4089-a853-e2027658e237/kube-rbac-proxy/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.353644 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ed2199-27ea-4280-8dad-f1a87a108724" path="/var/lib/kubelet/pods/35ed2199-27ea-4280-8dad-f1a87a108724/volumes" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.362501 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-mrz5k_e8df7824-e9a8-4794-bb91-411ae6639639/manager/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.483572 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-hhv4p_3c1366b7-aa64-4089-a853-e2027658e237/manager/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.520266 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-nghm2_ffab19aa-8b8f-4067-b19c-3ccd9352cb12/kube-rbac-proxy/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.566261 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-nghm2_ffab19aa-8b8f-4067-b19c-3ccd9352cb12/manager/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.700489 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-kvfmc_8ce74c21-dde5-40bb-8c42-96e4165b8541/kube-rbac-proxy/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.793032 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-kvfmc_8ce74c21-dde5-40bb-8c42-96e4165b8541/manager/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.855392 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-mklbg_c1571b7d-f7d4-470d-90ac-d276a39ea2b1/kube-rbac-proxy/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.907533 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-mklbg_c1571b7d-f7d4-470d-90ac-d276a39ea2b1/manager/0.log" Sep 30 07:09:58 crc kubenswrapper[4956]: I0930 07:09:58.999015 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-ncr6h_8c84e3e7-f42f-46df-af16-516bc2cac4a0/kube-rbac-proxy/0.log" Sep 30 07:09:59 crc kubenswrapper[4956]: I0930 07:09:59.044147 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-ncr6h_8c84e3e7-f42f-46df-af16-516bc2cac4a0/manager/0.log" Sep 30 07:09:59 crc kubenswrapper[4956]: I0930 07:09:59.148619 
4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d9c7d9477-g85xg_89ba53cc-155b-485b-926c-83eaa0772764/kube-rbac-proxy/0.log" Sep 30 07:09:59 crc kubenswrapper[4956]: I0930 07:09:59.359372 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f589bc7f7-jvf6n_6ee14caa-a939-467a-bdbb-4160d336eaee/kube-rbac-proxy/0.log" Sep 30 07:09:59 crc kubenswrapper[4956]: I0930 07:09:59.381770 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d9c7d9477-g85xg_89ba53cc-155b-485b-926c-83eaa0772764/manager/0.log" Sep 30 07:09:59 crc kubenswrapper[4956]: I0930 07:09:59.420777 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f589bc7f7-jvf6n_6ee14caa-a939-467a-bdbb-4160d336eaee/manager/0.log" Sep 30 07:09:59 crc kubenswrapper[4956]: I0930 07:09:59.521159 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-6hssf_101087b5-cd1e-40f3-916f-5e8f5354ac2d/kube-rbac-proxy/0.log" Sep 30 07:09:59 crc kubenswrapper[4956]: I0930 07:09:59.596479 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-6hssf_101087b5-cd1e-40f3-916f-5e8f5354ac2d/manager/0.log" Sep 30 07:09:59 crc kubenswrapper[4956]: I0930 07:09:59.611369 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-m8xs8_124abac1-4adc-4a56-8d2b-241e0eb4bf57/kube-rbac-proxy/0.log" Sep 30 07:09:59 crc kubenswrapper[4956]: I0930 07:09:59.726623 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-m8xs8_124abac1-4adc-4a56-8d2b-241e0eb4bf57/manager/0.log" Sep 30 07:09:59 crc 
kubenswrapper[4956]: I0930 07:09:59.804418 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-nhj4d_1ed82f79-3f95-4293-937a-f5d82ce37f10/kube-rbac-proxy/0.log" Sep 30 07:09:59 crc kubenswrapper[4956]: I0930 07:09:59.814431 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-nhj4d_1ed82f79-3f95-4293-937a-f5d82ce37f10/manager/0.log" Sep 30 07:10:00 crc kubenswrapper[4956]: I0930 07:10:00.008874 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6b96467f46-5x2c8_ca55e873-96fb-4348-ba98-58ab9648de78/kube-rbac-proxy/0.log" Sep 30 07:10:00 crc kubenswrapper[4956]: I0930 07:10:00.053789 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6b96467f46-5x2c8_ca55e873-96fb-4348-ba98-58ab9648de78/manager/0.log" Sep 30 07:10:00 crc kubenswrapper[4956]: I0930 07:10:00.133936 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79f9fc9fd8-j79gf_cbc3fab0-8876-49a7-a85f-4844e253595f/kube-rbac-proxy/0.log" Sep 30 07:10:00 crc kubenswrapper[4956]: I0930 07:10:00.240680 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6fb7d6b8bf-2rcv8_c07dffb1-ebd0-44e9-8061-ce680870aba3/kube-rbac-proxy/0.log" Sep 30 07:10:00 crc kubenswrapper[4956]: I0930 07:10:00.253783 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79f9fc9fd8-j79gf_cbc3fab0-8876-49a7-a85f-4844e253595f/manager/0.log" Sep 30 07:10:00 crc kubenswrapper[4956]: I0930 07:10:00.352712 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6fb7d6b8bf-2rcv8_c07dffb1-ebd0-44e9-8061-ce680870aba3/manager/0.log" Sep 30 07:10:00 crc kubenswrapper[4956]: I0930 07:10:00.449727 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82_0b6d8a4b-faca-4779-be46-219d3c0a3e22/manager/0.log" Sep 30 07:10:00 crc kubenswrapper[4956]: I0930 07:10:00.472521 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86b7cb4c5fvmt82_0b6d8a4b-faca-4779-be46-219d3c0a3e22/kube-rbac-proxy/0.log" Sep 30 07:10:00 crc kubenswrapper[4956]: I0930 07:10:00.611774 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7b7bb8bd67-7nhwl_2e369192-5374-4d18-954d-7d46ff60e9c1/kube-rbac-proxy/0.log" Sep 30 07:10:00 crc kubenswrapper[4956]: I0930 07:10:00.738058 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-56dc567787-c5lxf_3a68c8b3-216d-4ba4-b841-054d52526caf/kube-rbac-proxy/0.log" Sep 30 07:10:01 crc kubenswrapper[4956]: I0930 07:10:01.036318 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-56dc567787-c5lxf_3a68c8b3-216d-4ba4-b841-054d52526caf/operator/0.log" Sep 30 07:10:01 crc kubenswrapper[4956]: I0930 07:10:01.053304 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-v888v_c0c96af5-0c02-4dfd-91e5-947696cb4899/kube-rbac-proxy/0.log" Sep 30 07:10:01 crc kubenswrapper[4956]: I0930 07:10:01.066053 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-4bg6n_dbd8b6e7-61f4-4a07-92ab-ed42a432df93/registry-server/0.log" Sep 30 07:10:01 crc kubenswrapper[4956]: 
I0930 07:10:01.253588 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-xh7wx_69909a1b-9121-45ae-aaeb-e63950300ec9/kube-rbac-proxy/0.log" Sep 30 07:10:01 crc kubenswrapper[4956]: I0930 07:10:01.267926 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-v888v_c0c96af5-0c02-4dfd-91e5-947696cb4899/manager/0.log" Sep 30 07:10:01 crc kubenswrapper[4956]: I0930 07:10:01.315236 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-xh7wx_69909a1b-9121-45ae-aaeb-e63950300ec9/manager/0.log" Sep 30 07:10:01 crc kubenswrapper[4956]: I0930 07:10:01.523458 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-657c6b68c7-6hgwr_f93913e6-5d74-4030-ac26-a10781a72db0/kube-rbac-proxy/0.log" Sep 30 07:10:01 crc kubenswrapper[4956]: I0930 07:10:01.559971 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-j2hpj_49d67534-20e0-48be-9614-eec49889c4a7/operator/0.log" Sep 30 07:10:01 crc kubenswrapper[4956]: I0930 07:10:01.742925 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-657c6b68c7-6hgwr_f93913e6-5d74-4030-ac26-a10781a72db0/manager/0.log" Sep 30 07:10:01 crc kubenswrapper[4956]: I0930 07:10:01.757478 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-wxgr6_9cdfca4b-0805-4ebc-92e1-906044d82e4b/kube-rbac-proxy/0.log" Sep 30 07:10:01 crc kubenswrapper[4956]: I0930 07:10:01.999788 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb97fcf96-jz8bg_2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f/manager/0.log" Sep 30 
07:10:02 crc kubenswrapper[4956]: I0930 07:10:02.011638 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb97fcf96-jz8bg_2ddc99d7-6483-4ba5-b2f9-bd51a90ae36f/kube-rbac-proxy/0.log" Sep 30 07:10:02 crc kubenswrapper[4956]: I0930 07:10:02.024739 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7b7bb8bd67-7nhwl_2e369192-5374-4d18-954d-7d46ff60e9c1/manager/0.log" Sep 30 07:10:02 crc kubenswrapper[4956]: I0930 07:10:02.107841 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-wxgr6_9cdfca4b-0805-4ebc-92e1-906044d82e4b/manager/0.log" Sep 30 07:10:02 crc kubenswrapper[4956]: I0930 07:10:02.181550 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75756dd4d9-sbn5x_f620cf06-9ba1-4866-9964-dc38e574c889/kube-rbac-proxy/0.log" Sep 30 07:10:02 crc kubenswrapper[4956]: I0930 07:10:02.228849 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75756dd4d9-sbn5x_f620cf06-9ba1-4866-9964-dc38e574c889/manager/0.log" Sep 30 07:10:16 crc kubenswrapper[4956]: I0930 07:10:16.520589 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pclsc_aa301454-b9a6-4bed-acd1-f2cf109b5259/control-plane-machine-set-operator/0.log" Sep 30 07:10:16 crc kubenswrapper[4956]: I0930 07:10:16.717661 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wjwhf_9ec8ec3c-f0d3-41b1-a311-2eca015cd63a/kube-rbac-proxy/0.log" Sep 30 07:10:16 crc kubenswrapper[4956]: I0930 07:10:16.748153 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wjwhf_9ec8ec3c-f0d3-41b1-a311-2eca015cd63a/machine-api-operator/0.log" Sep 30 07:10:18 crc kubenswrapper[4956]: I0930 07:10:18.073977 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:10:18 crc kubenswrapper[4956]: I0930 07:10:18.074476 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:10:28 crc kubenswrapper[4956]: I0930 07:10:28.184966 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-skkpn_528b17e7-a42f-4e5e-8731-4f3d84d59cf7/cert-manager-controller/0.log" Sep 30 07:10:28 crc kubenswrapper[4956]: I0930 07:10:28.302222 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-9jcp8_5384e744-0e0a-4743-bf15-cb75c35951ac/cert-manager-cainjector/0.log" Sep 30 07:10:28 crc kubenswrapper[4956]: I0930 07:10:28.379371 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-cmwsp_50748964-4222-40d7-a12c-6ab004bf8a77/cert-manager-webhook/0.log" Sep 30 07:10:39 crc kubenswrapper[4956]: I0930 07:10:39.621667 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-svf7v_7a6e9183-4fbf-4549-926d-d8a48c0d17ac/nmstate-console-plugin/0.log" Sep 30 07:10:39 crc kubenswrapper[4956]: I0930 07:10:39.862415 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-dxkwj_1f61456f-3dcb-4831-9760-c06143ec9b14/nmstate-handler/0.log" Sep 30 07:10:39 crc kubenswrapper[4956]: I0930 07:10:39.890802 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-cfj2d_5c73bb61-b711-4efb-8ba7-de118a9b30e7/kube-rbac-proxy/0.log" Sep 30 07:10:39 crc kubenswrapper[4956]: I0930 07:10:39.919009 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-cfj2d_5c73bb61-b711-4efb-8ba7-de118a9b30e7/nmstate-metrics/0.log" Sep 30 07:10:40 crc kubenswrapper[4956]: I0930 07:10:40.061072 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-txhvp_34203d19-cae9-4ef7-863e-03f524e1a662/nmstate-operator/0.log" Sep 30 07:10:40 crc kubenswrapper[4956]: I0930 07:10:40.125633 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-vsdnt_8afb5dc0-a1aa-4c21-95b1-62c64b452ff1/nmstate-webhook/0.log" Sep 30 07:10:48 crc kubenswrapper[4956]: I0930 07:10:48.073572 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:10:48 crc kubenswrapper[4956]: I0930 07:10:48.075016 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:10:48 crc kubenswrapper[4956]: I0930 07:10:48.075091 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 07:10:48 crc kubenswrapper[4956]: I0930 07:10:48.076512 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a72ba49612cd65b05f8abd486b9207dfc5c1a9a11fa2c0d5239532cc63509a0"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:10:48 crc kubenswrapper[4956]: I0930 07:10:48.076665 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" containerID="cri-o://1a72ba49612cd65b05f8abd486b9207dfc5c1a9a11fa2c0d5239532cc63509a0" gracePeriod=600 Sep 30 07:10:48 crc kubenswrapper[4956]: I0930 07:10:48.651461 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="1a72ba49612cd65b05f8abd486b9207dfc5c1a9a11fa2c0d5239532cc63509a0" exitCode=0 Sep 30 07:10:48 crc kubenswrapper[4956]: I0930 07:10:48.651525 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"1a72ba49612cd65b05f8abd486b9207dfc5c1a9a11fa2c0d5239532cc63509a0"} Sep 30 07:10:48 crc kubenswrapper[4956]: I0930 07:10:48.652760 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerStarted","Data":"e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42"} Sep 30 07:10:48 crc kubenswrapper[4956]: I0930 07:10:48.652801 4956 scope.go:117] "RemoveContainer" 
containerID="1bc1ecd489046b94dd505fb011d91e0f32a22e848262f9ccf5650133c117d223" Sep 30 07:10:53 crc kubenswrapper[4956]: I0930 07:10:53.416650 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-ntsz7_0f07b182-685f-40c3-961e-eebfbf2d5fe5/kube-rbac-proxy/0.log" Sep 30 07:10:53 crc kubenswrapper[4956]: I0930 07:10:53.537690 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-ntsz7_0f07b182-685f-40c3-961e-eebfbf2d5fe5/controller/0.log" Sep 30 07:10:53 crc kubenswrapper[4956]: I0930 07:10:53.668768 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-frr-files/0.log" Sep 30 07:10:53 crc kubenswrapper[4956]: I0930 07:10:53.761067 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-reloader/0.log" Sep 30 07:10:53 crc kubenswrapper[4956]: I0930 07:10:53.776582 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-frr-files/0.log" Sep 30 07:10:53 crc kubenswrapper[4956]: I0930 07:10:53.810760 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-metrics/0.log" Sep 30 07:10:53 crc kubenswrapper[4956]: I0930 07:10:53.840913 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-reloader/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.051708 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-metrics/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.081838 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-reloader/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.095487 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-frr-files/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.100479 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-metrics/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.282306 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-frr-files/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.287160 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-reloader/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.287160 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/cp-metrics/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.295685 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/controller/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.441396 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/frr-metrics/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.482723 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/kube-rbac-proxy/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.509757 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/kube-rbac-proxy-frr/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.685275 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/reloader/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.725783 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-7r8bt_26860b1e-eab3-4a86-b87d-3c52529f70dd/frr-k8s-webhook-server/0.log" Sep 30 07:10:54 crc kubenswrapper[4956]: I0930 07:10:54.996899 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7478c46d8f-h6xxf_1cc7609e-41b3-4fb2-98f4-6cc743299a2f/manager/0.log" Sep 30 07:10:55 crc kubenswrapper[4956]: I0930 07:10:55.134144 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6bbf45b88f-6bpm6_889c063f-2550-48ed-957c-150f8f1192e3/webhook-server/0.log" Sep 30 07:10:55 crc kubenswrapper[4956]: I0930 07:10:55.234630 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5pvg9_c59b762c-1f05-46c5-8d6d-2bf39a8592f0/kube-rbac-proxy/0.log" Sep 30 07:10:55 crc kubenswrapper[4956]: I0930 07:10:55.929879 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5pvg9_c59b762c-1f05-46c5-8d6d-2bf39a8592f0/speaker/0.log" Sep 30 07:10:56 crc kubenswrapper[4956]: I0930 07:10:56.189467 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5mnk7_809472fb-9344-46be-81f4-808a1fcc16c6/frr/0.log" Sep 30 07:11:06 crc kubenswrapper[4956]: I0930 07:11:06.666243 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/util/0.log" Sep 30 07:11:06 crc kubenswrapper[4956]: 
I0930 07:11:06.886092 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/pull/0.log" Sep 30 07:11:06 crc kubenswrapper[4956]: I0930 07:11:06.915947 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/util/0.log" Sep 30 07:11:06 crc kubenswrapper[4956]: I0930 07:11:06.930708 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/pull/0.log" Sep 30 07:11:07 crc kubenswrapper[4956]: I0930 07:11:07.089916 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/util/0.log" Sep 30 07:11:07 crc kubenswrapper[4956]: I0930 07:11:07.092542 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/extract/0.log" Sep 30 07:11:07 crc kubenswrapper[4956]: I0930 07:11:07.094791 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvhnkz_7dd636e2-36d8-47b0-a01c-26852a43d3b3/pull/0.log" Sep 30 07:11:07 crc kubenswrapper[4956]: I0930 07:11:07.252278 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/util/0.log" Sep 30 07:11:07 crc kubenswrapper[4956]: I0930 07:11:07.392971 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/pull/0.log" Sep 30 07:11:07 crc kubenswrapper[4956]: I0930 07:11:07.415929 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/util/0.log" Sep 30 07:11:07 crc kubenswrapper[4956]: I0930 07:11:07.450684 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/pull/0.log" Sep 30 07:11:07 crc kubenswrapper[4956]: I0930 07:11:07.581381 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/util/0.log" Sep 30 07:11:07 crc kubenswrapper[4956]: I0930 07:11:07.581739 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/pull/0.log" Sep 30 07:11:07 crc kubenswrapper[4956]: I0930 07:11:07.588280 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djdq7k_50a2fa9e-1d17-41de-af81-c06f0afdf170/extract/0.log" Sep 30 07:11:07 crc kubenswrapper[4956]: I0930 07:11:07.889936 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/extract-utilities/0.log" Sep 30 07:11:08 crc kubenswrapper[4956]: I0930 07:11:08.055662 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/extract-utilities/0.log" Sep 30 07:11:08 crc kubenswrapper[4956]: I0930 
07:11:08.077783 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/extract-content/0.log" Sep 30 07:11:08 crc kubenswrapper[4956]: I0930 07:11:08.134035 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/extract-content/0.log" Sep 30 07:11:08 crc kubenswrapper[4956]: I0930 07:11:08.274428 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/extract-utilities/0.log" Sep 30 07:11:08 crc kubenswrapper[4956]: I0930 07:11:08.293324 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/extract-content/0.log" Sep 30 07:11:08 crc kubenswrapper[4956]: I0930 07:11:08.474585 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/extract-utilities/0.log" Sep 30 07:11:08 crc kubenswrapper[4956]: I0930 07:11:08.754455 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/extract-utilities/0.log" Sep 30 07:11:08 crc kubenswrapper[4956]: I0930 07:11:08.778661 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/extract-content/0.log" Sep 30 07:11:08 crc kubenswrapper[4956]: I0930 07:11:08.835597 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/extract-content/0.log" Sep 30 07:11:09 crc kubenswrapper[4956]: I0930 07:11:09.053955 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/extract-content/0.log" Sep 30 07:11:09 crc kubenswrapper[4956]: I0930 07:11:09.079898 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/extract-utilities/0.log" Sep 30 07:11:09 crc kubenswrapper[4956]: I0930 07:11:09.114991 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k2bs6_23fe685f-5eac-43e5-b4fc-c48e85553142/registry-server/0.log" Sep 30 07:11:09 crc kubenswrapper[4956]: I0930 07:11:09.270980 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/util/0.log" Sep 30 07:11:09 crc kubenswrapper[4956]: I0930 07:11:09.554018 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/pull/0.log" Sep 30 07:11:09 crc kubenswrapper[4956]: I0930 07:11:09.593753 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/pull/0.log" Sep 30 07:11:09 crc kubenswrapper[4956]: I0930 07:11:09.599730 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/util/0.log" Sep 30 07:11:09 crc kubenswrapper[4956]: I0930 07:11:09.868050 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nsq7j_e97b25d9-fabb-4c79-a586-c0fa9c73ee2d/registry-server/0.log" Sep 30 07:11:09 crc kubenswrapper[4956]: I0930 07:11:09.876254 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/util/0.log" Sep 30 07:11:09 crc kubenswrapper[4956]: I0930 07:11:09.912077 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/pull/0.log" Sep 30 07:11:09 crc kubenswrapper[4956]: I0930 07:11:09.919607 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d967nzgh_5435ca6f-5e89-4884-ab53-526b66bf688e/extract/0.log" Sep 30 07:11:10 crc kubenswrapper[4956]: I0930 07:11:10.100905 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2zl8q_747a9b33-025d-4b52-9b54-7d1b829c6cef/marketplace-operator/0.log" Sep 30 07:11:10 crc kubenswrapper[4956]: I0930 07:11:10.194470 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/extract-utilities/0.log" Sep 30 07:11:10 crc kubenswrapper[4956]: I0930 07:11:10.318911 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/extract-utilities/0.log" Sep 30 07:11:10 crc kubenswrapper[4956]: I0930 07:11:10.343024 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/extract-content/0.log" Sep 30 07:11:10 crc kubenswrapper[4956]: I0930 07:11:10.345679 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/extract-content/0.log" Sep 30 07:11:10 crc kubenswrapper[4956]: I0930 07:11:10.546059 4956 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/extract-content/0.log" Sep 30 07:11:10 crc kubenswrapper[4956]: I0930 07:11:10.571617 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/extract-utilities/0.log" Sep 30 07:11:10 crc kubenswrapper[4956]: I0930 07:11:10.622053 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/extract-utilities/0.log" Sep 30 07:11:10 crc kubenswrapper[4956]: I0930 07:11:10.709699 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rjpbl_4b003c5c-9c5c-4732-9370-c49aed57a7c2/registry-server/0.log" Sep 30 07:11:10 crc kubenswrapper[4956]: I0930 07:11:10.852997 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/extract-utilities/0.log" Sep 30 07:11:10 crc kubenswrapper[4956]: I0930 07:11:10.909628 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/extract-content/0.log" Sep 30 07:11:10 crc kubenswrapper[4956]: I0930 07:11:10.930373 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/extract-content/0.log" Sep 30 07:11:11 crc kubenswrapper[4956]: I0930 07:11:11.067222 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/extract-utilities/0.log" Sep 30 07:11:11 crc kubenswrapper[4956]: I0930 07:11:11.069315 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/extract-content/0.log" Sep 
30 07:11:11 crc kubenswrapper[4956]: I0930 07:11:11.788419 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q9m7s_de4dac4b-b680-4e12-beb2-a999348eddd7/registry-server/0.log" Sep 30 07:11:22 crc kubenswrapper[4956]: I0930 07:11:22.882873 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-vl7fv_45c3115a-12d7-4cd7-83a8-f9a720e63ce6/prometheus-operator/0.log" Sep 30 07:11:23 crc kubenswrapper[4956]: I0930 07:11:23.046996 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5f8b585c84-4hbf5_0dd88442-b29e-47e9-b221-57ac09bbc7cb/prometheus-operator-admission-webhook/0.log" Sep 30 07:11:23 crc kubenswrapper[4956]: I0930 07:11:23.081983 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5f8b585c84-8p69b_868bdc73-4a3b-49ec-9676-0d98a950e1ed/prometheus-operator-admission-webhook/0.log" Sep 30 07:11:23 crc kubenswrapper[4956]: I0930 07:11:23.245810 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-pv5cs_f9203661-7a5b-45cd-9057-78b70739a89b/operator/0.log" Sep 30 07:11:23 crc kubenswrapper[4956]: I0930 07:11:23.293738 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-j6v4z_d32f519a-014e-43e2-b715-78e9fd9197c3/perses-operator/0.log" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.594165 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-smvh4"] Sep 30 07:12:44 crc kubenswrapper[4956]: E0930 07:12:44.595598 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ed2199-27ea-4280-8dad-f1a87a108724" containerName="container-00" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.595617 4956 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="35ed2199-27ea-4280-8dad-f1a87a108724" containerName="container-00" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.595898 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ed2199-27ea-4280-8dad-f1a87a108724" containerName="container-00" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.597601 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.620511 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smvh4"] Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.790295 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-utilities\") pod \"redhat-operators-smvh4\" (UID: \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\") " pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.790347 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmhz5\" (UniqueName: \"kubernetes.io/projected/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-kube-api-access-mmhz5\") pod \"redhat-operators-smvh4\" (UID: \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\") " pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.790377 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-catalog-content\") pod \"redhat-operators-smvh4\" (UID: \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\") " pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.893026 4956 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-utilities\") pod \"redhat-operators-smvh4\" (UID: \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\") " pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.893085 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmhz5\" (UniqueName: \"kubernetes.io/projected/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-kube-api-access-mmhz5\") pod \"redhat-operators-smvh4\" (UID: \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\") " pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.893202 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-catalog-content\") pod \"redhat-operators-smvh4\" (UID: \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\") " pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.893749 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-utilities\") pod \"redhat-operators-smvh4\" (UID: \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\") " pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.893822 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-catalog-content\") pod \"redhat-operators-smvh4\" (UID: \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\") " pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.916089 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhz5\" 
(UniqueName: \"kubernetes.io/projected/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-kube-api-access-mmhz5\") pod \"redhat-operators-smvh4\" (UID: \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\") " pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:44 crc kubenswrapper[4956]: I0930 07:12:44.928224 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:45 crc kubenswrapper[4956]: I0930 07:12:45.442904 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smvh4"] Sep 30 07:12:45 crc kubenswrapper[4956]: I0930 07:12:45.775211 4956 generic.go:334] "Generic (PLEG): container finished" podID="bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" containerID="4f8d9a3ee01c1ca85f0554cccc2ddb0bcd75f14fcfbe857430a94afc63c87854" exitCode=0 Sep 30 07:12:45 crc kubenswrapper[4956]: I0930 07:12:45.775358 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smvh4" event={"ID":"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea","Type":"ContainerDied","Data":"4f8d9a3ee01c1ca85f0554cccc2ddb0bcd75f14fcfbe857430a94afc63c87854"} Sep 30 07:12:45 crc kubenswrapper[4956]: I0930 07:12:45.775517 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smvh4" event={"ID":"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea","Type":"ContainerStarted","Data":"21f9f205d003c22262319091b631e127f888297a9f148106938091a06f43ef78"} Sep 30 07:12:45 crc kubenswrapper[4956]: I0930 07:12:45.777474 4956 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:12:47 crc kubenswrapper[4956]: I0930 07:12:47.805048 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smvh4" event={"ID":"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea","Type":"ContainerStarted","Data":"d0121b63729d6540bfedb61ddef1236a3526527f96de2c1b7c76eaac927e3851"} Sep 30 07:12:48 crc 
kubenswrapper[4956]: I0930 07:12:48.074493 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:12:48 crc kubenswrapper[4956]: I0930 07:12:48.074578 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:12:48 crc kubenswrapper[4956]: I0930 07:12:48.816163 4956 generic.go:334] "Generic (PLEG): container finished" podID="bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" containerID="d0121b63729d6540bfedb61ddef1236a3526527f96de2c1b7c76eaac927e3851" exitCode=0 Sep 30 07:12:48 crc kubenswrapper[4956]: I0930 07:12:48.816211 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smvh4" event={"ID":"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea","Type":"ContainerDied","Data":"d0121b63729d6540bfedb61ddef1236a3526527f96de2c1b7c76eaac927e3851"} Sep 30 07:12:49 crc kubenswrapper[4956]: I0930 07:12:49.826999 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smvh4" event={"ID":"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea","Type":"ContainerStarted","Data":"90c79295811b946c0031c50bbdcd93c82257945fefcc5f2c66793b78eb9ebaad"} Sep 30 07:12:49 crc kubenswrapper[4956]: I0930 07:12:49.849092 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-smvh4" podStartSLOduration=2.174212767 podStartE2EDuration="5.849071311s" podCreationTimestamp="2025-09-30 07:12:44 +0000 UTC" firstStartedPulling="2025-09-30 07:12:45.777184281 +0000 UTC 
m=+6236.104304806" lastFinishedPulling="2025-09-30 07:12:49.452042825 +0000 UTC m=+6239.779163350" observedRunningTime="2025-09-30 07:12:49.842507365 +0000 UTC m=+6240.169627880" watchObservedRunningTime="2025-09-30 07:12:49.849071311 +0000 UTC m=+6240.176191836" Sep 30 07:12:54 crc kubenswrapper[4956]: I0930 07:12:54.928790 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:54 crc kubenswrapper[4956]: I0930 07:12:54.929339 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:54 crc kubenswrapper[4956]: I0930 07:12:54.986989 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:55 crc kubenswrapper[4956]: I0930 07:12:55.930758 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:55 crc kubenswrapper[4956]: I0930 07:12:55.993458 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smvh4"] Sep 30 07:12:57 crc kubenswrapper[4956]: I0930 07:12:57.895809 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-smvh4" podUID="bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" containerName="registry-server" containerID="cri-o://90c79295811b946c0031c50bbdcd93c82257945fefcc5f2c66793b78eb9ebaad" gracePeriod=2 Sep 30 07:12:58 crc kubenswrapper[4956]: I0930 07:12:58.905160 4956 generic.go:334] "Generic (PLEG): container finished" podID="bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" containerID="90c79295811b946c0031c50bbdcd93c82257945fefcc5f2c66793b78eb9ebaad" exitCode=0 Sep 30 07:12:58 crc kubenswrapper[4956]: I0930 07:12:58.905440 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smvh4" 
event={"ID":"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea","Type":"ContainerDied","Data":"90c79295811b946c0031c50bbdcd93c82257945fefcc5f2c66793b78eb9ebaad"} Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.495579 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.519454 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-catalog-content\") pod \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\" (UID: \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\") " Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.519984 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-utilities\") pod \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\" (UID: \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\") " Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.520155 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmhz5\" (UniqueName: \"kubernetes.io/projected/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-kube-api-access-mmhz5\") pod \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\" (UID: \"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea\") " Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.523000 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-utilities" (OuterVolumeSpecName: "utilities") pod "bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" (UID: "bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.527102 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-kube-api-access-mmhz5" (OuterVolumeSpecName: "kube-api-access-mmhz5") pod "bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" (UID: "bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea"). InnerVolumeSpecName "kube-api-access-mmhz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.616189 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" (UID: "bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.623697 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.623803 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmhz5\" (UniqueName: \"kubernetes.io/projected/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-kube-api-access-mmhz5\") on node \"crc\" DevicePath \"\"" Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.623885 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.920367 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smvh4" 
event={"ID":"bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea","Type":"ContainerDied","Data":"21f9f205d003c22262319091b631e127f888297a9f148106938091a06f43ef78"} Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.920424 4956 scope.go:117] "RemoveContainer" containerID="90c79295811b946c0031c50bbdcd93c82257945fefcc5f2c66793b78eb9ebaad" Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.920615 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smvh4" Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.949875 4956 scope.go:117] "RemoveContainer" containerID="d0121b63729d6540bfedb61ddef1236a3526527f96de2c1b7c76eaac927e3851" Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.972201 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smvh4"] Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.983877 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-smvh4"] Sep 30 07:12:59 crc kubenswrapper[4956]: I0930 07:12:59.985946 4956 scope.go:117] "RemoveContainer" containerID="4f8d9a3ee01c1ca85f0554cccc2ddb0bcd75f14fcfbe857430a94afc63c87854" Sep 30 07:13:00 crc kubenswrapper[4956]: I0930 07:13:00.358061 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" path="/var/lib/kubelet/pods/bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea/volumes" Sep 30 07:13:18 crc kubenswrapper[4956]: I0930 07:13:18.073579 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:13:18 crc kubenswrapper[4956]: I0930 07:13:18.074516 4956 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:13:48 crc kubenswrapper[4956]: I0930 07:13:48.073359 4956 patch_prober.go:28] interesting pod/machine-config-daemon-hx8cm container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:13:48 crc kubenswrapper[4956]: I0930 07:13:48.074072 4956 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:13:48 crc kubenswrapper[4956]: I0930 07:13:48.074170 4956 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" Sep 30 07:13:48 crc kubenswrapper[4956]: I0930 07:13:48.075419 4956 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42"} pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:13:48 crc kubenswrapper[4956]: I0930 07:13:48.075516 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerName="machine-config-daemon" 
containerID="cri-o://e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42" gracePeriod=600 Sep 30 07:13:48 crc kubenswrapper[4956]: E0930 07:13:48.192905 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:13:48 crc kubenswrapper[4956]: I0930 07:13:48.413730 4956 generic.go:334] "Generic (PLEG): container finished" podID="5ecd015b-e216-40d8-ae78-711b2a65c193" containerID="e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42" exitCode=0 Sep 30 07:13:48 crc kubenswrapper[4956]: I0930 07:13:48.413784 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" event={"ID":"5ecd015b-e216-40d8-ae78-711b2a65c193","Type":"ContainerDied","Data":"e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42"} Sep 30 07:13:48 crc kubenswrapper[4956]: I0930 07:13:48.413908 4956 scope.go:117] "RemoveContainer" containerID="1a72ba49612cd65b05f8abd486b9207dfc5c1a9a11fa2c0d5239532cc63509a0" Sep 30 07:13:48 crc kubenswrapper[4956]: I0930 07:13:48.415554 4956 scope.go:117] "RemoveContainer" containerID="e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42" Sep 30 07:13:48 crc kubenswrapper[4956]: E0930 07:13:48.416104 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" 
podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:13:48 crc kubenswrapper[4956]: I0930 07:13:48.416934 4956 generic.go:334] "Generic (PLEG): container finished" podID="c55bd359-a76e-48ab-8fa1-0c402351b7b0" containerID="41edb3d7cbfe7b5413100d95ab4faf2a04343c542b8c5d46c6cf2f208fb374ff" exitCode=0 Sep 30 07:13:48 crc kubenswrapper[4956]: I0930 07:13:48.416975 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv6ft/must-gather-5ls46" event={"ID":"c55bd359-a76e-48ab-8fa1-0c402351b7b0","Type":"ContainerDied","Data":"41edb3d7cbfe7b5413100d95ab4faf2a04343c542b8c5d46c6cf2f208fb374ff"} Sep 30 07:13:48 crc kubenswrapper[4956]: I0930 07:13:48.418079 4956 scope.go:117] "RemoveContainer" containerID="41edb3d7cbfe7b5413100d95ab4faf2a04343c542b8c5d46c6cf2f208fb374ff" Sep 30 07:13:49 crc kubenswrapper[4956]: I0930 07:13:49.344573 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mv6ft_must-gather-5ls46_c55bd359-a76e-48ab-8fa1-0c402351b7b0/gather/0.log" Sep 30 07:13:56 crc kubenswrapper[4956]: E0930 07:13:56.259415 4956 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.82:53524->38.102.83.82:43469: write tcp 38.102.83.82:53524->38.102.83.82:43469: write: broken pipe Sep 30 07:14:01 crc kubenswrapper[4956]: I0930 07:14:01.341273 4956 scope.go:117] "RemoveContainer" containerID="e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42" Sep 30 07:14:01 crc kubenswrapper[4956]: E0930 07:14:01.342146 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:14:02 crc kubenswrapper[4956]: I0930 
07:14:02.418595 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mv6ft/must-gather-5ls46"] Sep 30 07:14:02 crc kubenswrapper[4956]: I0930 07:14:02.419253 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mv6ft/must-gather-5ls46" podUID="c55bd359-a76e-48ab-8fa1-0c402351b7b0" containerName="copy" containerID="cri-o://d8a7ef2d7a5bcdc7f5f4ede90ea59ded5178e1f0e05d9d1ca381cbec94519b07" gracePeriod=2 Sep 30 07:14:02 crc kubenswrapper[4956]: I0930 07:14:02.428160 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mv6ft/must-gather-5ls46"] Sep 30 07:14:02 crc kubenswrapper[4956]: I0930 07:14:02.552616 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mv6ft_must-gather-5ls46_c55bd359-a76e-48ab-8fa1-0c402351b7b0/copy/0.log" Sep 30 07:14:02 crc kubenswrapper[4956]: I0930 07:14:02.553206 4956 generic.go:334] "Generic (PLEG): container finished" podID="c55bd359-a76e-48ab-8fa1-0c402351b7b0" containerID="d8a7ef2d7a5bcdc7f5f4ede90ea59ded5178e1f0e05d9d1ca381cbec94519b07" exitCode=143 Sep 30 07:14:02 crc kubenswrapper[4956]: I0930 07:14:02.919735 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mv6ft_must-gather-5ls46_c55bd359-a76e-48ab-8fa1-0c402351b7b0/copy/0.log" Sep 30 07:14:02 crc kubenswrapper[4956]: I0930 07:14:02.921957 4956 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mv6ft/must-gather-5ls46" Sep 30 07:14:03 crc kubenswrapper[4956]: I0930 07:14:03.000573 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w84b\" (UniqueName: \"kubernetes.io/projected/c55bd359-a76e-48ab-8fa1-0c402351b7b0-kube-api-access-2w84b\") pod \"c55bd359-a76e-48ab-8fa1-0c402351b7b0\" (UID: \"c55bd359-a76e-48ab-8fa1-0c402351b7b0\") " Sep 30 07:14:03 crc kubenswrapper[4956]: I0930 07:14:03.000798 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c55bd359-a76e-48ab-8fa1-0c402351b7b0-must-gather-output\") pod \"c55bd359-a76e-48ab-8fa1-0c402351b7b0\" (UID: \"c55bd359-a76e-48ab-8fa1-0c402351b7b0\") " Sep 30 07:14:03 crc kubenswrapper[4956]: I0930 07:14:03.007289 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55bd359-a76e-48ab-8fa1-0c402351b7b0-kube-api-access-2w84b" (OuterVolumeSpecName: "kube-api-access-2w84b") pod "c55bd359-a76e-48ab-8fa1-0c402351b7b0" (UID: "c55bd359-a76e-48ab-8fa1-0c402351b7b0"). InnerVolumeSpecName "kube-api-access-2w84b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:14:03 crc kubenswrapper[4956]: I0930 07:14:03.103083 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w84b\" (UniqueName: \"kubernetes.io/projected/c55bd359-a76e-48ab-8fa1-0c402351b7b0-kube-api-access-2w84b\") on node \"crc\" DevicePath \"\"" Sep 30 07:14:03 crc kubenswrapper[4956]: I0930 07:14:03.188058 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c55bd359-a76e-48ab-8fa1-0c402351b7b0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c55bd359-a76e-48ab-8fa1-0c402351b7b0" (UID: "c55bd359-a76e-48ab-8fa1-0c402351b7b0"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:14:03 crc kubenswrapper[4956]: I0930 07:14:03.204735 4956 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c55bd359-a76e-48ab-8fa1-0c402351b7b0-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 07:14:03 crc kubenswrapper[4956]: I0930 07:14:03.579714 4956 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mv6ft_must-gather-5ls46_c55bd359-a76e-48ab-8fa1-0c402351b7b0/copy/0.log" Sep 30 07:14:03 crc kubenswrapper[4956]: I0930 07:14:03.607195 4956 scope.go:117] "RemoveContainer" containerID="d8a7ef2d7a5bcdc7f5f4ede90ea59ded5178e1f0e05d9d1ca381cbec94519b07" Sep 30 07:14:03 crc kubenswrapper[4956]: I0930 07:14:03.607309 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv6ft/must-gather-5ls46" Sep 30 07:14:03 crc kubenswrapper[4956]: I0930 07:14:03.700236 4956 scope.go:117] "RemoveContainer" containerID="41edb3d7cbfe7b5413100d95ab4faf2a04343c542b8c5d46c6cf2f208fb374ff" Sep 30 07:14:04 crc kubenswrapper[4956]: I0930 07:14:04.354853 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55bd359-a76e-48ab-8fa1-0c402351b7b0" path="/var/lib/kubelet/pods/c55bd359-a76e-48ab-8fa1-0c402351b7b0/volumes" Sep 30 07:14:12 crc kubenswrapper[4956]: I0930 07:14:12.340904 4956 scope.go:117] "RemoveContainer" containerID="e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42" Sep 30 07:14:12 crc kubenswrapper[4956]: E0930 07:14:12.341723 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" 
podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:14:25 crc kubenswrapper[4956]: I0930 07:14:25.341360 4956 scope.go:117] "RemoveContainer" containerID="e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42" Sep 30 07:14:25 crc kubenswrapper[4956]: E0930 07:14:25.342098 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:14:37 crc kubenswrapper[4956]: I0930 07:14:37.341511 4956 scope.go:117] "RemoveContainer" containerID="e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42" Sep 30 07:14:37 crc kubenswrapper[4956]: E0930 07:14:37.342316 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:14:51 crc kubenswrapper[4956]: I0930 07:14:51.342365 4956 scope.go:117] "RemoveContainer" containerID="e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42" Sep 30 07:14:51 crc kubenswrapper[4956]: E0930 07:14:51.343510 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.142144 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69"] Sep 30 07:15:00 crc kubenswrapper[4956]: E0930 07:15:00.143110 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55bd359-a76e-48ab-8fa1-0c402351b7b0" containerName="copy" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.143143 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55bd359-a76e-48ab-8fa1-0c402351b7b0" containerName="copy" Sep 30 07:15:00 crc kubenswrapper[4956]: E0930 07:15:00.143183 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55bd359-a76e-48ab-8fa1-0c402351b7b0" containerName="gather" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.143190 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55bd359-a76e-48ab-8fa1-0c402351b7b0" containerName="gather" Sep 30 07:15:00 crc kubenswrapper[4956]: E0930 07:15:00.143210 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" containerName="extract-utilities" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.143218 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" containerName="extract-utilities" Sep 30 07:15:00 crc kubenswrapper[4956]: E0930 07:15:00.143230 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" containerName="extract-content" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.143235 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" containerName="extract-content" Sep 30 07:15:00 crc kubenswrapper[4956]: E0930 07:15:00.143248 4956 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" containerName="registry-server" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.143254 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" containerName="registry-server" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.143440 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55bd359-a76e-48ab-8fa1-0c402351b7b0" containerName="gather" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.143461 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55bd359-a76e-48ab-8fa1-0c402351b7b0" containerName="copy" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.143473 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf2a1afd-5333-4f80-84c3-9bea4ad3b3ea" containerName="registry-server" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.144518 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.147862 4956 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.147921 4956 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.161863 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69"] Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.328304 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f348a5-c724-4428-9d1a-320ae7a31716-secret-volume\") pod \"collect-profiles-29320275-nsj69\" (UID: 
\"d8f348a5-c724-4428-9d1a-320ae7a31716\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.328552 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nptxq\" (UniqueName: \"kubernetes.io/projected/d8f348a5-c724-4428-9d1a-320ae7a31716-kube-api-access-nptxq\") pod \"collect-profiles-29320275-nsj69\" (UID: \"d8f348a5-c724-4428-9d1a-320ae7a31716\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.329188 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f348a5-c724-4428-9d1a-320ae7a31716-config-volume\") pod \"collect-profiles-29320275-nsj69\" (UID: \"d8f348a5-c724-4428-9d1a-320ae7a31716\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.431405 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f348a5-c724-4428-9d1a-320ae7a31716-secret-volume\") pod \"collect-profiles-29320275-nsj69\" (UID: \"d8f348a5-c724-4428-9d1a-320ae7a31716\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.431501 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nptxq\" (UniqueName: \"kubernetes.io/projected/d8f348a5-c724-4428-9d1a-320ae7a31716-kube-api-access-nptxq\") pod \"collect-profiles-29320275-nsj69\" (UID: \"d8f348a5-c724-4428-9d1a-320ae7a31716\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.431612 4956 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f348a5-c724-4428-9d1a-320ae7a31716-config-volume\") pod \"collect-profiles-29320275-nsj69\" (UID: \"d8f348a5-c724-4428-9d1a-320ae7a31716\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.432539 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f348a5-c724-4428-9d1a-320ae7a31716-config-volume\") pod \"collect-profiles-29320275-nsj69\" (UID: \"d8f348a5-c724-4428-9d1a-320ae7a31716\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.441338 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f348a5-c724-4428-9d1a-320ae7a31716-secret-volume\") pod \"collect-profiles-29320275-nsj69\" (UID: \"d8f348a5-c724-4428-9d1a-320ae7a31716\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.450401 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nptxq\" (UniqueName: \"kubernetes.io/projected/d8f348a5-c724-4428-9d1a-320ae7a31716-kube-api-access-nptxq\") pod \"collect-profiles-29320275-nsj69\" (UID: \"d8f348a5-c724-4428-9d1a-320ae7a31716\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.477705 4956 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:00 crc kubenswrapper[4956]: I0930 07:15:00.927461 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69"] Sep 30 07:15:01 crc kubenswrapper[4956]: I0930 07:15:01.173034 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" event={"ID":"d8f348a5-c724-4428-9d1a-320ae7a31716","Type":"ContainerStarted","Data":"614a9fbd748882c8a188b1af3cfa72d930dd29f12e84c03e0644bab416e0227c"} Sep 30 07:15:01 crc kubenswrapper[4956]: I0930 07:15:01.173432 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" event={"ID":"d8f348a5-c724-4428-9d1a-320ae7a31716","Type":"ContainerStarted","Data":"7f04b552311848c4319582682b865b42248d50e0b35ffaa033c6e73c51c4b150"} Sep 30 07:15:01 crc kubenswrapper[4956]: I0930 07:15:01.194017 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" podStartSLOduration=1.193994593 podStartE2EDuration="1.193994593s" podCreationTimestamp="2025-09-30 07:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:15:01.187360825 +0000 UTC m=+6371.514481370" watchObservedRunningTime="2025-09-30 07:15:01.193994593 +0000 UTC m=+6371.521115138" Sep 30 07:15:02 crc kubenswrapper[4956]: I0930 07:15:02.182480 4956 generic.go:334] "Generic (PLEG): container finished" podID="d8f348a5-c724-4428-9d1a-320ae7a31716" containerID="614a9fbd748882c8a188b1af3cfa72d930dd29f12e84c03e0644bab416e0227c" exitCode=0 Sep 30 07:15:02 crc kubenswrapper[4956]: I0930 07:15:02.182540 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" event={"ID":"d8f348a5-c724-4428-9d1a-320ae7a31716","Type":"ContainerDied","Data":"614a9fbd748882c8a188b1af3cfa72d930dd29f12e84c03e0644bab416e0227c"} Sep 30 07:15:03 crc kubenswrapper[4956]: I0930 07:15:03.518826 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:03 crc kubenswrapper[4956]: I0930 07:15:03.592212 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f348a5-c724-4428-9d1a-320ae7a31716-secret-volume\") pod \"d8f348a5-c724-4428-9d1a-320ae7a31716\" (UID: \"d8f348a5-c724-4428-9d1a-320ae7a31716\") " Sep 30 07:15:03 crc kubenswrapper[4956]: I0930 07:15:03.592290 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f348a5-c724-4428-9d1a-320ae7a31716-config-volume\") pod \"d8f348a5-c724-4428-9d1a-320ae7a31716\" (UID: \"d8f348a5-c724-4428-9d1a-320ae7a31716\") " Sep 30 07:15:03 crc kubenswrapper[4956]: I0930 07:15:03.592367 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nptxq\" (UniqueName: \"kubernetes.io/projected/d8f348a5-c724-4428-9d1a-320ae7a31716-kube-api-access-nptxq\") pod \"d8f348a5-c724-4428-9d1a-320ae7a31716\" (UID: \"d8f348a5-c724-4428-9d1a-320ae7a31716\") " Sep 30 07:15:03 crc kubenswrapper[4956]: I0930 07:15:03.593822 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f348a5-c724-4428-9d1a-320ae7a31716-config-volume" (OuterVolumeSpecName: "config-volume") pod "d8f348a5-c724-4428-9d1a-320ae7a31716" (UID: "d8f348a5-c724-4428-9d1a-320ae7a31716"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:15:03 crc kubenswrapper[4956]: I0930 07:15:03.599431 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f348a5-c724-4428-9d1a-320ae7a31716-kube-api-access-nptxq" (OuterVolumeSpecName: "kube-api-access-nptxq") pod "d8f348a5-c724-4428-9d1a-320ae7a31716" (UID: "d8f348a5-c724-4428-9d1a-320ae7a31716"). InnerVolumeSpecName "kube-api-access-nptxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:15:03 crc kubenswrapper[4956]: I0930 07:15:03.600468 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f348a5-c724-4428-9d1a-320ae7a31716-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d8f348a5-c724-4428-9d1a-320ae7a31716" (UID: "d8f348a5-c724-4428-9d1a-320ae7a31716"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:15:03 crc kubenswrapper[4956]: I0930 07:15:03.694883 4956 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8f348a5-c724-4428-9d1a-320ae7a31716-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:15:03 crc kubenswrapper[4956]: I0930 07:15:03.694925 4956 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f348a5-c724-4428-9d1a-320ae7a31716-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:15:03 crc kubenswrapper[4956]: I0930 07:15:03.694941 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nptxq\" (UniqueName: \"kubernetes.io/projected/d8f348a5-c724-4428-9d1a-320ae7a31716-kube-api-access-nptxq\") on node \"crc\" DevicePath \"\"" Sep 30 07:15:04 crc kubenswrapper[4956]: I0930 07:15:04.205085 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" 
event={"ID":"d8f348a5-c724-4428-9d1a-320ae7a31716","Type":"ContainerDied","Data":"7f04b552311848c4319582682b865b42248d50e0b35ffaa033c6e73c51c4b150"} Sep 30 07:15:04 crc kubenswrapper[4956]: I0930 07:15:04.205347 4956 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f04b552311848c4319582682b865b42248d50e0b35ffaa033c6e73c51c4b150" Sep 30 07:15:04 crc kubenswrapper[4956]: I0930 07:15:04.205112 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-nsj69" Sep 30 07:15:04 crc kubenswrapper[4956]: I0930 07:15:04.259226 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6"] Sep 30 07:15:04 crc kubenswrapper[4956]: I0930 07:15:04.268477 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320230-v2pz6"] Sep 30 07:15:04 crc kubenswrapper[4956]: I0930 07:15:04.351354 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6831bf62-b6e7-420c-9f69-bce3e2921222" path="/var/lib/kubelet/pods/6831bf62-b6e7-420c-9f69-bce3e2921222/volumes" Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.680509 4956 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l8zzd"] Sep 30 07:15:05 crc kubenswrapper[4956]: E0930 07:15:05.681444 4956 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f348a5-c724-4428-9d1a-320ae7a31716" containerName="collect-profiles" Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.681463 4956 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f348a5-c724-4428-9d1a-320ae7a31716" containerName="collect-profiles" Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.681726 4956 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f348a5-c724-4428-9d1a-320ae7a31716" containerName="collect-profiles" 
Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.683832 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l8zzd" Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.690206 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l8zzd"] Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.736564 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d22845a8-ea87-4258-b956-a668ff7e422d-utilities\") pod \"certified-operators-l8zzd\" (UID: \"d22845a8-ea87-4258-b956-a668ff7e422d\") " pod="openshift-marketplace/certified-operators-l8zzd" Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.736659 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d22845a8-ea87-4258-b956-a668ff7e422d-catalog-content\") pod \"certified-operators-l8zzd\" (UID: \"d22845a8-ea87-4258-b956-a668ff7e422d\") " pod="openshift-marketplace/certified-operators-l8zzd" Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.736829 4956 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bcrg\" (UniqueName: \"kubernetes.io/projected/d22845a8-ea87-4258-b956-a668ff7e422d-kube-api-access-9bcrg\") pod \"certified-operators-l8zzd\" (UID: \"d22845a8-ea87-4258-b956-a668ff7e422d\") " pod="openshift-marketplace/certified-operators-l8zzd" Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.839105 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bcrg\" (UniqueName: \"kubernetes.io/projected/d22845a8-ea87-4258-b956-a668ff7e422d-kube-api-access-9bcrg\") pod \"certified-operators-l8zzd\" (UID: \"d22845a8-ea87-4258-b956-a668ff7e422d\") " 
pod="openshift-marketplace/certified-operators-l8zzd" Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.839629 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d22845a8-ea87-4258-b956-a668ff7e422d-utilities\") pod \"certified-operators-l8zzd\" (UID: \"d22845a8-ea87-4258-b956-a668ff7e422d\") " pod="openshift-marketplace/certified-operators-l8zzd" Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.839682 4956 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d22845a8-ea87-4258-b956-a668ff7e422d-catalog-content\") pod \"certified-operators-l8zzd\" (UID: \"d22845a8-ea87-4258-b956-a668ff7e422d\") " pod="openshift-marketplace/certified-operators-l8zzd" Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.840216 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d22845a8-ea87-4258-b956-a668ff7e422d-utilities\") pod \"certified-operators-l8zzd\" (UID: \"d22845a8-ea87-4258-b956-a668ff7e422d\") " pod="openshift-marketplace/certified-operators-l8zzd" Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.840330 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d22845a8-ea87-4258-b956-a668ff7e422d-catalog-content\") pod \"certified-operators-l8zzd\" (UID: \"d22845a8-ea87-4258-b956-a668ff7e422d\") " pod="openshift-marketplace/certified-operators-l8zzd" Sep 30 07:15:05 crc kubenswrapper[4956]: I0930 07:15:05.859945 4956 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bcrg\" (UniqueName: \"kubernetes.io/projected/d22845a8-ea87-4258-b956-a668ff7e422d-kube-api-access-9bcrg\") pod \"certified-operators-l8zzd\" (UID: \"d22845a8-ea87-4258-b956-a668ff7e422d\") " 
pod="openshift-marketplace/certified-operators-l8zzd" Sep 30 07:15:06 crc kubenswrapper[4956]: I0930 07:15:06.019473 4956 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l8zzd" Sep 30 07:15:06 crc kubenswrapper[4956]: I0930 07:15:06.341038 4956 scope.go:117] "RemoveContainer" containerID="e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42" Sep 30 07:15:06 crc kubenswrapper[4956]: E0930 07:15:06.341647 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193" Sep 30 07:15:06 crc kubenswrapper[4956]: I0930 07:15:06.516747 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l8zzd"] Sep 30 07:15:07 crc kubenswrapper[4956]: I0930 07:15:07.249304 4956 generic.go:334] "Generic (PLEG): container finished" podID="d22845a8-ea87-4258-b956-a668ff7e422d" containerID="4d2bdfe10e6ae9543765b90c08f6465b9be40eb404c19833a81c1093568dbb8e" exitCode=0 Sep 30 07:15:07 crc kubenswrapper[4956]: I0930 07:15:07.249404 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l8zzd" event={"ID":"d22845a8-ea87-4258-b956-a668ff7e422d","Type":"ContainerDied","Data":"4d2bdfe10e6ae9543765b90c08f6465b9be40eb404c19833a81c1093568dbb8e"} Sep 30 07:15:07 crc kubenswrapper[4956]: I0930 07:15:07.249603 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l8zzd" event={"ID":"d22845a8-ea87-4258-b956-a668ff7e422d","Type":"ContainerStarted","Data":"8bc2a01706bd99cdf296ebb32de455aa55b81c43fac61b717104a0d7622c628c"} Sep 30 
07:15:11 crc kubenswrapper[4956]: I0930 07:15:11.286156 4956 generic.go:334] "Generic (PLEG): container finished" podID="d22845a8-ea87-4258-b956-a668ff7e422d" containerID="a17b4ebb27443330ebe455209a91f1d045d301d4c825df3532157e2030bee5c3" exitCode=0
Sep 30 07:15:11 crc kubenswrapper[4956]: I0930 07:15:11.286224 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l8zzd" event={"ID":"d22845a8-ea87-4258-b956-a668ff7e422d","Type":"ContainerDied","Data":"a17b4ebb27443330ebe455209a91f1d045d301d4c825df3532157e2030bee5c3"}
Sep 30 07:15:12 crc kubenswrapper[4956]: I0930 07:15:12.297734 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l8zzd" event={"ID":"d22845a8-ea87-4258-b956-a668ff7e422d","Type":"ContainerStarted","Data":"2bdeb847d75240664d488da95b36ac4fdf2a0a1e3f9dff0706b6e94c2dad702d"}
Sep 30 07:15:12 crc kubenswrapper[4956]: I0930 07:15:12.317856 4956 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l8zzd" podStartSLOduration=2.815261988 podStartE2EDuration="7.317839958s" podCreationTimestamp="2025-09-30 07:15:05 +0000 UTC" firstStartedPulling="2025-09-30 07:15:07.252725194 +0000 UTC m=+6377.579845719" lastFinishedPulling="2025-09-30 07:15:11.755303154 +0000 UTC m=+6382.082423689" observedRunningTime="2025-09-30 07:15:12.316802425 +0000 UTC m=+6382.643922980" watchObservedRunningTime="2025-09-30 07:15:12.317839958 +0000 UTC m=+6382.644960483"
Sep 30 07:15:16 crc kubenswrapper[4956]: I0930 07:15:16.020776 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l8zzd"
Sep 30 07:15:16 crc kubenswrapper[4956]: I0930 07:15:16.021313 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l8zzd"
Sep 30 07:15:16 crc kubenswrapper[4956]: I0930 07:15:16.070946 4956 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l8zzd"
Sep 30 07:15:16 crc kubenswrapper[4956]: I0930 07:15:16.414667 4956 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l8zzd"
Sep 30 07:15:16 crc kubenswrapper[4956]: I0930 07:15:16.495699 4956 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l8zzd"]
Sep 30 07:15:16 crc kubenswrapper[4956]: I0930 07:15:16.553517 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k2bs6"]
Sep 30 07:15:16 crc kubenswrapper[4956]: I0930 07:15:16.553790 4956 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k2bs6" podUID="23fe685f-5eac-43e5-b4fc-c48e85553142" containerName="registry-server" containerID="cri-o://e93bf60e387308ee14c39595df27547e8d8f0634ce2ccac5af62b3b4577d2715" gracePeriod=2
Sep 30 07:15:17 crc kubenswrapper[4956]: I0930 07:15:17.373248 4956 generic.go:334] "Generic (PLEG): container finished" podID="23fe685f-5eac-43e5-b4fc-c48e85553142" containerID="e93bf60e387308ee14c39595df27547e8d8f0634ce2ccac5af62b3b4577d2715" exitCode=0
Sep 30 07:15:17 crc kubenswrapper[4956]: I0930 07:15:17.375336 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bs6" event={"ID":"23fe685f-5eac-43e5-b4fc-c48e85553142","Type":"ContainerDied","Data":"e93bf60e387308ee14c39595df27547e8d8f0634ce2ccac5af62b3b4577d2715"}
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.005231 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.190602 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hhpk\" (UniqueName: \"kubernetes.io/projected/23fe685f-5eac-43e5-b4fc-c48e85553142-kube-api-access-9hhpk\") pod \"23fe685f-5eac-43e5-b4fc-c48e85553142\" (UID: \"23fe685f-5eac-43e5-b4fc-c48e85553142\") "
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.190758 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23fe685f-5eac-43e5-b4fc-c48e85553142-utilities\") pod \"23fe685f-5eac-43e5-b4fc-c48e85553142\" (UID: \"23fe685f-5eac-43e5-b4fc-c48e85553142\") "
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.190850 4956 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23fe685f-5eac-43e5-b4fc-c48e85553142-catalog-content\") pod \"23fe685f-5eac-43e5-b4fc-c48e85553142\" (UID: \"23fe685f-5eac-43e5-b4fc-c48e85553142\") "
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.191451 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23fe685f-5eac-43e5-b4fc-c48e85553142-utilities" (OuterVolumeSpecName: "utilities") pod "23fe685f-5eac-43e5-b4fc-c48e85553142" (UID: "23fe685f-5eac-43e5-b4fc-c48e85553142"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.198614 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fe685f-5eac-43e5-b4fc-c48e85553142-kube-api-access-9hhpk" (OuterVolumeSpecName: "kube-api-access-9hhpk") pod "23fe685f-5eac-43e5-b4fc-c48e85553142" (UID: "23fe685f-5eac-43e5-b4fc-c48e85553142"). InnerVolumeSpecName "kube-api-access-9hhpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.268349 4956 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23fe685f-5eac-43e5-b4fc-c48e85553142-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23fe685f-5eac-43e5-b4fc-c48e85553142" (UID: "23fe685f-5eac-43e5-b4fc-c48e85553142"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.293778 4956 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23fe685f-5eac-43e5-b4fc-c48e85553142-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.293823 4956 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23fe685f-5eac-43e5-b4fc-c48e85553142-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.293840 4956 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hhpk\" (UniqueName: \"kubernetes.io/projected/23fe685f-5eac-43e5-b4fc-c48e85553142-kube-api-access-9hhpk\") on node \"crc\" DevicePath \"\""
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.384774 4956 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k2bs6"
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.384819 4956 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bs6" event={"ID":"23fe685f-5eac-43e5-b4fc-c48e85553142","Type":"ContainerDied","Data":"2b3bc720657a92d4954ecfc5f0c440ee03285e0662f09ea193a6a0e22204e3cb"}
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.384909 4956 scope.go:117] "RemoveContainer" containerID="e93bf60e387308ee14c39595df27547e8d8f0634ce2ccac5af62b3b4577d2715"
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.420275 4956 scope.go:117] "RemoveContainer" containerID="a0df705ba135beb14501c9835257f8b069e2c8f752763ab543345307dc098921"
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.420408 4956 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k2bs6"]
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.428781 4956 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k2bs6"]
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.443875 4956 scope.go:117] "RemoveContainer" containerID="9ef971135706cda34fd6e50b70f0e2b967d33ec1a50dd4d08a9055547376358d"
Sep 30 07:15:18 crc kubenswrapper[4956]: I0930 07:15:18.973181 4956 scope.go:117] "RemoveContainer" containerID="e4a5e80e6551c1f52c1625352c1a80cf2c7b189dfe33c6922430681d85dc4f51"
Sep 30 07:15:20 crc kubenswrapper[4956]: I0930 07:15:20.348828 4956 scope.go:117] "RemoveContainer" containerID="e0d9b57bcefb7db923f0eb3e1c7b3cfb622233cbcd78679d7d26d3793fb22c42"
Sep 30 07:15:20 crc kubenswrapper[4956]: E0930 07:15:20.349525 4956 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hx8cm_openshift-machine-config-operator(5ecd015b-e216-40d8-ae78-711b2a65c193)\"" pod="openshift-machine-config-operator/machine-config-daemon-hx8cm" podUID="5ecd015b-e216-40d8-ae78-711b2a65c193"
Sep 30 07:15:20 crc kubenswrapper[4956]: I0930 07:15:20.358639 4956 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fe685f-5eac-43e5-b4fc-c48e85553142" path="/var/lib/kubelet/pods/23fe685f-5eac-43e5-b4fc-c48e85553142/volumes"